This AI can diagnose stroke just by analyzing tongue color
Researchers have developed a computer algorithm that can diagnose diseases by analyzing the color of a person's tongue. The algorithm was developed by a team from Middle Technical University (MTU) and the University of South Australia (UniSA). The technology, which demonstrated a 98% accuracy rate in testing, can identify conditions such as anaemia, asthma, diabetes, stroke, liver and gallbladder issues, COVID-19, and other vascular and gastrointestinal diseases.
Tongue characteristics: A window into health conditions
Ali Al-Naji, an adjunct associate professor at MTU and UniSA, explained that the shape, color, and thickness of the tongue can reveal a variety of health conditions. For instance, a yellow tongue is often seen in people with diabetes, while cancer patients may have a purple tongue with a thick greasy coating. Acute stroke patients may present with an unusually shaped red tongue, and a white tongue can indicate anaemia.
AI model's training and validation process
The algorithm was trained on 5,260 images to detect variations in tongue color. An additional 60 images, supplied by two teaching hospitals in the Middle East and representing patients with diverse health conditions, were used to validate the model. The AI model matched tongue color to the correct disease in nearly all cases, consistent with the reported 98% accuracy rate.
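To make the idea of color-based classification concrete, here is a minimal, hypothetical sketch: it assigns a tongue's mean colour to whichever condition has the nearest "typical" colour. The centroid values, labels, and the nearest-centroid rule are all illustrative assumptions for this sketch; they are not the MTU/UniSA model and have no clinical validity.

```python
# Illustrative "trained" centroids: a mean RGB colour per condition.
# These values are invented for the sketch, not taken from the study.
CENTROIDS = {
    "diabetes (yellowish)": (200, 180, 80),
    "anaemia (whitish)": (230, 225, 215),
    "acute stroke (reddish)": (190, 60, 60),
}

def classify_tongue_colour(rgb):
    """Return the condition whose centroid is nearest to the given mean RGB."""
    def dist2(a, b):
        # Squared Euclidean distance in RGB space.
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(CENTROIDS, key=lambda label: dist2(rgb, CENTROIDS[label]))

print(classify_tongue_colour((195, 70, 65)))  # → acute stroke (reddish)
```

A real system would learn such decision boundaries from the thousands of labelled training images rather than using hand-picked centroids, but the principle, mapping a measured colour to the nearest learned class, is the same.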
Real-time diagnosis and potential smartphone adaptation
The system analyzes tongue color to provide real-time diagnoses, a process that mirrors a 2,000-year-old technique from traditional Chinese medicine. Co-author and UniSA Professor Javaan Chahl suggested that this technology could eventually be adapted for use with smartphones. "These results confirm that computerized tongue analysis is a secure, efficient, user-friendly, and affordable method for disease screening," Chahl said.
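As a rough illustration of what a smartphone adaptation might do per camera frame, the sketch below averages pixel colours over a toy frame to produce the single colour feature a classifier would consume. The frame data is synthetic and the pipeline is an assumption; a real app would read frames from the phone camera and first segment the tongue region.

```python
def mean_colour(frame):
    """Mean (R, G, B) over a frame given as rows of (r, g, b) pixel tuples."""
    pixels = [px for row in frame for px in row]
    n = len(pixels)
    return tuple(round(sum(px[c] for px in pixels) / n) for c in range(3))

# Toy 2x2 "frame" of reddish pixels standing in for a cropped tongue region.
frame = [[(190, 60, 60), (200, 70, 62)],
         [(188, 58, 64), (194, 64, 58)]]
print(mean_colour(frame))  # → (193, 63, 61)
```

Running this step on every frame is what would make the diagnosis "real time": each captured frame yields a colour estimate that can be screened immediately.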
Challenges and future prospects
Despite its potential, the technology faces challenges such as patient reluctance to provide tongue images and camera reflections that can affect accuracy. These hurdles, however, do not diminish its promise as a tool for real-time health assessment and disease screening.
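One way a system could mitigate the reflection problem is to reject frames dominated by specular glare before analysing colour. The sketch below flags a frame when too many pixels are near-white; the brightness threshold and the 5% cutoff are illustrative assumptions, not values from the study.

```python
def has_glare(frame, brightness_threshold=240, max_fraction=0.05):
    """Return True if more than max_fraction of pixels are near-white,
    i.e. likely specular reflections from the camera flash or lighting."""
    pixels = [px for row in frame for px in row]
    bright = sum(1 for r, g, b in pixels if min(r, g, b) >= brightness_threshold)
    return bright / len(pixels) > max_fraction

# Toy frame: two glare pixels (near-white) among two reddish ones.
glare_frame = [[(250, 248, 246), (190, 60, 60)],
               [(250, 250, 250), (188, 58, 64)]]
print(has_glare(glare_frame))  # → True
```

Discarding such frames and prompting the user to retake the photo is a simple guard that keeps unreliable colour measurements out of the screening step.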