Artificial intelligence predicts tongue diseases with 96 percent accuracy

In a recent study published in Technologies, researchers developed a novel system that uses machine learning to predict tongue diseases.

Study: Prediction of tongue diseases based on machine learning algorithms. Image credit: fizkes/Shutterstock.com

Background

The traditional diagnosis of tongue disease is based on observing tongue characteristics such as color, shape, texture, and moisture, which provide information about a patient's state of health.

Practitioners of Traditional Chinese Medicine (TCM) rely on subjective assessments of tongue characteristics, leading to subjectivity in diagnosis and replication problems. The rise of artificial intelligence (AI) has created a strong demand for breakthroughs in tongue diagnosis technology.

Automated tongue color analysis systems have demonstrated high accuracy in identifying healthy and diseased individuals and diagnosing various medical conditions. Artificial intelligence has made tremendous progress in capturing, analyzing, and categorizing tongue images.

The convergence of artificial intelligence approaches in tongue diagnostic research aims to increase reliability and accuracy while addressing the long-term prospects for large-scale AI applications in healthcare.

About the study

The present study proposes a novel machine learning-based imaging system to analyze and extract tongue color features at different color saturations and under different lighting conditions for real-time tongue color analysis and disease prediction.

The imaging system trained tongue images classified by color using six machine learning algorithms to predict tongue color. The algorithms included Support Vector Machines (SVM), Naive Bayes (NB), Decision Trees (DTs), K-Nearest Neighbors (KNN), Extreme Gradient Boost (XGBoost), and Random Forest (RF) classifiers.

The color models were as follows: hue, saturation, and value (HSV); red, green, and blue (RGB); luminance-chrominance separations (YCbCr and YIQ); and lightness with green-red and blue-yellow axes (LAB).

The researchers split the data into training sets (80%) and test sets (20%). The training set included 5,260 images classified as yellow (n=1,010), red (n=1,102), blue (n=1,024), green (n=945), pink (n=310), white (n=300), and gray (n=737) for different lighting conditions and saturations.
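The training setup described above can be sketched as follows. This is a minimal illustration using scikit-learn stand-ins, not the study's actual MATLAB pipeline: the feature matrix is synthetic, the hyperparameters are defaults, and `GradientBoostingClassifier` stands in for the XGBoost library.

```python
# Illustrative sketch of the six-classifier comparison with an 80/20 split.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for the tongue-color feature matrix: each row is a
# vector of channel intensities, each label one of seven color classes.
X, y = make_classification(n_samples=500, n_features=12, n_informative=8,
                           n_classes=7, random_state=0)

# 80% training / 20% test split, as in the study.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.20, random_state=0)

classifiers = {
    "SVM": SVC(),
    "Naive Bayes": GaussianNB(),
    "Decision Tree": DecisionTreeClassifier(random_state=0),
    "KNN": KNeighborsClassifier(),
    # Stand-in for XGBoost, to stay within scikit-learn.
    "Gradient Boosting": GradientBoostingClassifier(random_state=0),
    "Random Forest": RandomForestClassifier(random_state=0),
}

# Fit each classifier and report its test-set accuracy.
scores = {name: clf.fit(X_train, y_train).score(X_test, y_test)
          for name, clf in classifiers.items()}
for name, acc in scores.items():
    print(f"{name}: {acc:.2%}")
```

On real tongue-image features, the relative ranking of these classifiers is what the study compared; the accuracies printed here reflect only the synthetic data.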

The second group included 60 images of pathological tongues from Mosul General Hospital and Al-Hussein Hospital in Iraq. They showed individuals with various conditions such as diabetes, asthma, fungal infections, kidney failure, COVID-19, anemia, and fungiform papillae.

The patients sat 20 cm away from the camera while the machine learning algorithm recognized the color of their tongue and predicted their health status in real time.

The researchers used laptops running the MATLAB App Designer program and webcams with a resolution of 1,920 x 1,080 pixels to extract the tongue's color features. Image analysis included segmenting the central region of the tongue image and eliminating any mustache, beard, lips, and teeth before analysis.

After image analysis, the system converted the RGB space into the HSV, YCbCr, YIQ, and LAB models. After color classification, the intensities from the different color channels were fed to the machine learning algorithms to train the image model.
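The color-space conversions mentioned above can be sketched for a single pixel using only the Python standard library (the study performed these in MATLAB on whole images; the LAB conversion is omitted here because it requires a reference white point and is not available in the standard library, and the example pixel value is illustrative).

```python
# Per-pixel RGB -> HSV / YIQ / YCbCr conversions, standard library only.
import colorsys

def rgb_to_ycbcr(r, g, b):
    """ITU-R BT.601 full-range conversion; r, g, b in [0, 255]."""
    y  =  0.299 * r + 0.587 * g + 0.114 * b
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return y, cb, cr

# Example pixel: a pinkish tongue color (illustrative value, not from the study).
r, g, b = 220, 130, 140

h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)    # HSV model
yy, i, q = colorsys.rgb_to_yiq(r / 255, g / 255, b / 255)   # YIQ model
y, cb, cr = rgb_to_ycbcr(r, g, b)                           # YCbCr model

print(f"HSV:   h={h:.3f} s={s:.3f} v={v:.3f}")
print(f"YIQ:   y={yy:.3f} i={i:.3f} q={q:.3f}")
print(f"YCbCr: y={y:.1f} cb={cb:.1f} cr={cr:.1f}")
```

In the study's pipeline, conversions like these would be applied to every pixel of the segmented tongue region, and the resulting channel intensities would form the feature vectors fed to the classifiers.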

Performance evaluation measures included precision, accuracy, recall, Jaccard index, F1 scores, G scores, zero-one losses, Cohen’s kappa, Hamming loss, Fowlkes-Mallow index, and Matthews correlation coefficient (MCC).
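Most of the evaluation measures listed above have direct scikit-learn equivalents, illustrated here on a toy set of labels (the G-score has no scikit-learn implementation and is omitted; the label values are purely illustrative).

```python
# Illustrative computation of the study's evaluation measures on toy labels.
from sklearn.metrics import (accuracy_score, cohen_kappa_score, f1_score,
                             fowlkes_mallows_score, hamming_loss,
                             jaccard_score, matthews_corrcoef,
                             precision_score, recall_score, zero_one_loss)

y_true = [0, 1, 2, 2, 1, 0, 1, 2]   # toy ground-truth color classes
y_pred = [0, 1, 2, 2, 0, 0, 1, 2]   # toy predictions (one error)

metrics = {
    "accuracy":        accuracy_score(y_true, y_pred),
    "precision":       precision_score(y_true, y_pred, average="macro"),
    "recall":          recall_score(y_true, y_pred, average="macro"),
    "F1":              f1_score(y_true, y_pred, average="macro"),
    "Jaccard":         jaccard_score(y_true, y_pred, average="macro"),
    "zero-one loss":   zero_one_loss(y_true, y_pred),
    "Cohen's kappa":   cohen_kappa_score(y_true, y_pred),
    "Hamming loss":    hamming_loss(y_true, y_pred),
    "Fowlkes-Mallows": fowlkes_mallows_score(y_true, y_pred),
    "MCC":             matthews_corrcoef(y_true, y_pred),
}
for name, value in metrics.items():
    print(f"{name}: {value:.3f}")
```

Reporting this many complementary measures, rather than accuracy alone, is what lets the study distinguish a classifier that merely guesses the majority color from one that balances precision and recall across all seven classes.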

Results

The results showed that XGBoost was the most accurate (98.7%), while the Naive Bayes technique had the lowest accuracy (91%). For XGBoost, F1 scores of 98% represented an excellent balance between recall and precision.

The Jaccard index of 0.99, zero-one loss of 0.01, G-score of 0.92, Hamming loss of 0.01, Cohen's kappa of 1.0, MCC of 0.4, and Fowlkes-Mallows index of 0.98 indicated near-perfect positive correlations, showing that XGBoost is highly reliable and effective for tongue analysis. XGBoost ranked first in precision, accuracy, F1-score, recall, and MCC.

Based on these findings, the researchers used XGBoost as the algorithm for the proposed tongue imaging tool, which is linked to a graphical user interface and predicts tongue color and related disorders in real time.

The imaging system delivered positive results after deployment. The machine learning-based system correctly recognized 58 of the 60 tongue images, a recognition accuracy of 96.6%.

A pink tongue indicates good health, but other shades indicate disease. Patients with yellow tongues were classified as diabetic, while those with green tongues were diagnosed with fungal diseases.

A blue tongue indicates asthma, a red-colored tongue indicates coronavirus disease 2019 (COVID-19), a black tongue indicates the presence of fungiform papillae, and a white tongue indicates anemia.

Conclusions

Overall, the real-time imaging system with XGBoost delivered positive results upon deployment with a diagnostic accuracy of 96.6%. These findings support the viability of artificial intelligence systems for tongue recognition in medical applications and demonstrate that this method is safe, efficient, user-friendly, enjoyable and cost-effective.

Camera reflections can lead to differences in the observed colors, thus affecting the diagnosis. Future studies should take camera reflections into account and use powerful image processors, filters, and deep learning approaches to increase accuracy. This method paves the way for advanced tongue diagnostics in future point-of-care healthcare systems.

By Bronte
