Recent data from two research endeavors adds to the growing evidence supporting the use of artificial intelligence (AI) for ocular disease screening, this time for glaucomatous optic neuropathy (GON) and diabetic retinopathy (DR).1,2 Yet another study found a way to sidestep the time-consuming classification of age-related macular degeneration (AMD) images using AI.3 With the recent FDA approval of IDx-DR (IDx, LLC)—the first medical device that uses AI to detect “greater than mild” DR—AI technology is closer than ever to clinical practice.4
Go Small or Go Home
A new study even puts AI capabilities for DR in the palm of your hand. Researchers screened 296 patients in India using a smartphone-based fundus photography platform, grading the images both with validated AI DR screening software and manually by ophthalmologists. They found the AI software had 95.8% sensitivity and 80.2% specificity for detecting any DR, and 99.1% sensitivity and 80.4% specificity for detecting sight-threatening DR (STDR).1 The AI software “replicates or exceeds the diagnostic interpretations by ophthalmologists and retina specialists,” said Louis J. Catania, OD, in a commentary for Practice Update.5
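Sensitivity and specificity figures like these come straight from a screening confusion matrix: sensitivity is the share of truly diseased eyes the screener flags, and specificity is the share of truly healthy eyes it correctly clears. A minimal sketch in Python, using made-up counts for illustration (not the study's data):

```python
def sensitivity(tp, fn):
    """True positive rate: diseased eyes correctly flagged / all diseased eyes."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """True negative rate: healthy eyes correctly cleared / all healthy eyes."""
    return tn / (tn + fp)

# Hypothetical confusion-matrix counts for a screening run
tp, fn = 92, 4     # diseased eyes: flagged vs. missed
tn, fp = 154, 38   # healthy eyes: cleared vs. falsely flagged

print(round(sensitivity(tp, fn), 3))  # 0.958
print(round(specificity(tn, fp), 3))  # 0.802
```

Note the trade-off visible in the reported numbers: tuning a screener for very high sensitivity (missing almost no disease) typically costs some specificity, producing more false referrals.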
In addition, the smartphone-based platform “is essentially equivalent to the AI-analyzed conventional retinal imaging reported in previous studies,” noted Dr. Catania. Using AI on a phone “can provide unique benefits to primary healthcare and eye care practitioners and to public health as a validated, highly accurate, accessible (cost-effective, telemedicine capable and portable) screening and early detection system for STDR,” he wrote.5
Glaucoma Up Close
Other researchers tackled GON screening with AI and had similarly positive results. They asked 21 trained ophthalmologists to classify 48,119 fundus photographs for referable GON, defined as a vertical cup-to-disc ratio of 0.7 or more in addition to other changes typical of the condition. After training a deep learning algorithm on the graded images, they found the AI software achieved a sensitivity of 95.6% and specificity of 92.0% in detecting referable GON.2
The investigators found coexisting high or pathologic myopia was the most common cause of false-negative results, while physiologic cupping and pathologic myopia were the most common culprits for false-positive results. Still, the results show promise for AI in screening for this condition, the authors concluded.2
Classifying AMD in the Blink of an Eye
Manually grading fundus images for AMD is a painstaking process that takes considerable time—unless you use deep learning tools, a new study found. German researchers used 120,656 manually graded color fundus images from the Age-Related Eye Disease Study (AREDS) and 5,555 fundus images from their country’s Kooperative Gesundheitsforschung in der Region Augsburg study to build an AI grading tool for distinguishing 13 different AMD classes. They found the AI system predicted the 13 classes within a 95% confidence interval and, even more importantly, detected 84.2% of all fundus images with definite signs of early or late AMD. Overall, the system classified 94.3% of healthy fundus images correctly.3
“Our deep learning algorithm revealed a weighted κ [a statistical measure of reliability in classification methods] outperforming human graders in the AREDS study and is suitable to classify AMD fundus images in other datasets using individuals >55 years of age,” the study authors said.3
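The weighted κ the authors cite measures agreement between two graders on an ordered scale, penalizing disagreements more heavily the further apart the two assigned classes fall. A minimal sketch of linearly weighted Cohen's kappa on a hypothetical three-class agreement matrix (illustrative only, not the study's data):

```python
def weighted_kappa(confusion, weights="linear"):
    """Cohen's weighted kappa for a square rater-agreement matrix.

    confusion[i][j] = number of images rater A assigned to class i
    and rater B assigned to class j. Returns 1.0 for perfect agreement,
    0.0 for chance-level agreement.
    """
    k = len(confusion)
    n = sum(sum(row) for row in confusion)
    row_tot = [sum(row) for row in confusion]
    col_tot = [sum(confusion[i][j] for i in range(k)) for j in range(k)]

    def w(i, j):  # disagreement weight: 0 on the diagonal, grows with distance
        d = abs(i - j) / (k - 1)
        return d if weights == "linear" else d ** 2

    observed = sum(w(i, j) * confusion[i][j]
                   for i in range(k) for j in range(k))
    expected = sum(w(i, j) * row_tot[i] * col_tot[j] / n
                   for i in range(k) for j in range(k))
    return 1 - observed / expected

# Hypothetical AI-vs.-human grading matrix for three ordered severity classes
confusion = [[30, 5, 0],
             [4, 40, 6],
             [1, 5, 29]]
print(round(weighted_kappa(confusion), 3))  # 0.778
```

A κ near 1 indicates the algorithm's grades track the human reference closely; values around 0.8 are generally read as strong agreement.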
1. Rajalakshmi R, Subashini R, Anjana RM, Mohan V. Automated diabetic retinopathy detection in smartphone-based fundus photography using artificial intelligence. Eye (Lond). March 9, 2018. [Epub ahead of print].