Artificial intelligence in eye care

Optometry Times Journal, October digital edition 2023, Volume 15, Issue 10

The good, the bad, and the questionable.

What is artificial intelligence?

(Image: Adobe Stock / elenavolf)

The concept of machines doing human tasks has been debated for decades, perhaps even centuries. In the first half of the 20th century, factory workers feared that mechanization would displace them. Decades later, automation has arguably bettered our lives and made society more productive. The same fear has been expressed about artificial intelligence (AI) in general and its use in health care in particular.

Artificial intelligence in its contemporary application traces its origins to the mid-1930s. Anyone who has seen The Imitation Game may recognize the agony of Alan Turing attempting to crack, and eventually cracking, enemy-coded messages. His Turing machines are widely regarded as forerunners of today's computers. What is the job of computers but to perform complex tasks that would be burdensome for humans? How do they accomplish those tasks? Simply put, they employ algorithms to funnel data toward an answer. And what is an algorithm? It is a set of rules a computer follows to digest large amounts of data and produce a conclusion (not unlike a diagnosis).

It’s all in the algorithms

Taking a step back, think of the rules for simple addition or long division. In elementary school, we all learned how to do these operations by hand. Now these and much more complex calculations can be done with our handheld devices using algorithms.
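As a purely illustrative sketch of what "a set of rules" means in code, the grade-school division procedure can be written as a short function; the rule set here is repeated subtraction, not how a real calculator implements division:

```python
def long_division(dividend: int, divisor: int) -> tuple[int, int]:
    """Integer division by the grade-school rule set:
    repeatedly subtract the divisor and count how many times it fits."""
    if divisor == 0:
        raise ValueError("division by zero")
    quotient, remainder = 0, dividend
    while remainder >= divisor:
        remainder -= divisor
        quotient += 1
    return quotient, remainder

print(long_division(437, 12))  # (36, 5): 12 * 36 + 5 = 437
```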

We can find examples of the application of algorithms all around us. Spellchecker is an example that is useful when our fingers hit the wrong keys. But what happens when we intend something that the spell-checking algorithm disagrees with? It must be manually corrected (more on bad algorithms later). There are numerous examples of algorithms in modern automobiles. In fact, it has been suggested that there are potential carryovers from the development of autonomous vehicles to physician assistance in health care.1 Think of IBM Watson making the diagnosis.2
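A spellchecker can be sketched in a few lines using Python's standard difflib, which suggests the closest entries from a word list; the tiny dictionary below is hypothetical and stands in for the large lexicons real spellcheckers use:

```python
import difflib

# A tiny, hypothetical dictionary of accepted terms.
DICTIONARY = ["retina", "cornea", "glaucoma", "cataract", "macula"]

def suggest(word: str, n: int = 3) -> list[str]:
    """Return the closest dictionary entries to a possibly mistyped word."""
    return difflib.get_close_matches(word.lower(), DICTIONARY, n=n, cutoff=0.6)

print(suggest("retnia"))    # ['retina']
print(suggest("glaucmoa"))  # ['glaucoma']
```

Note that the algorithm only ranks similarity; if the intended word is not in its dictionary, it will still "correct" it, which is exactly the kind of disagreement that has to be fixed by hand.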

As algorithms evolve and are refined, they improve accuracy. Mistakes need to be identified and corrected to prevent false positive results. One personal example of a bad algorithm is my Apple Watch. Although it alerts for serious incidents such as falls, it has also alerted me when I am wearing it and try to shake sunscreen to the end of the tube (Figure 1).

Machine learning, deep learning, and convolutional neural networks

Let’s look at a case example. A 22-year-old man presents with a 2-day history of a unilateral floater. His personal and family medical histories are noncontributory. His ophthalmic history is uncomplicated, and he does not wear a refractive correction. He takes no medications and does not use any illicit drugs. Visual acuity is 20/20 in each eye without refractive correction. The anterior segment is unremarkable by slit lamp examination, and intraocular pressures are within the statistically normal range for each eye by applanation. Dilated fundus examination reveals the clinical findings recorded by color fundus photography (Figure 2). The list of differentials may include posterior vitreous detachment (not myopic), posttraumatic incidents (denies), retinal tear, and inflammatory or infectious etiologies. These findings open a new realm of questioning.

Further investigation reveals that the family had just adopted a new kitten and that the patient had a scratch on the back of his left hand. Combining the history with the retinal finding (granuloma) and the optic disc finding (swelling), a diagnosis of neuroretinitis is made. This is a paradigm wherein feature extraction (clinical findings) is combined with classification (differential diagnoses) to produce a diagnostic conclusion. In an AI protocol, this would be at the level known as classic machine learning.3 If the fundus image were analyzed pixel by pixel, the observed clinical findings were interpreted along with the history, and a diagnostic conclusion were reached, it would be at the level known as deep learning.3 This level of clinical analysis is on the horizon and has been explored most widely in diabetic retinopathy.
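A minimal sketch of the classic machine learning pattern described above: the clinician (or an upstream image-analysis step) extracts the features, and only the classification step is learned from data. The feature names, training cases, and labels here are hypothetical placeholders; a real system would be trained on thousands of documented cases:

```python
from sklearn.tree import DecisionTreeClassifier

# Hypothetical hand-crafted features, 1 = present, 0 = absent:
# [disc_swelling, retinal_granuloma, animal_scratch_history, myopia, trauma_history]
X_train = [
    [1, 1, 1, 0, 0],   # documented neuroretinitis cases
    [1, 1, 1, 0, 0],
    [0, 0, 0, 1, 0],   # posterior vitreous detachment
    [0, 0, 0, 0, 1],   # posttraumatic retinal tear
]
y_train = ["neuroretinitis", "neuroretinitis", "PVD", "retinal tear"]

# Classic machine learning: features are extracted by a human,
# and the classifier learns only the mapping from features to diagnosis.
clf = DecisionTreeClassifier().fit(X_train, y_train)

patient = [[1, 1, 1, 0, 0]]  # granuloma + disc swelling + kitten scratch
print(clf.predict(patient))  # ['neuroretinitis']
```

Deep learning differs in that the pixel data themselves, rather than hand-picked findings, are the input, and the features are learned by the network.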

The next phase for automated analysis of imaging is known as wayfinding AI.4 For example, when a patient presents with symptoms and signs suggesting central serous chorioretinopathy, optical coherence tomography (OCT) would be ordered in addition to standard color fundus photography. A wayfinding algorithm would analyze a volumetric OCT scan layer by layer to assess for subtle irregularities that may be overlooked when manually interpreting the cross-sectional line scans.
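The layer-by-layer idea can be sketched as a loop over B-scans, assuming the OCT volume is available as a NumPy array; the anomaly score below (deviation from a normative value) is a stand-in for a trained wayfinding model, not the published method:

```python
import numpy as np

def flag_suspect_slices(oct_volume: np.ndarray, normative_mean: float,
                        normative_sd: float, z_cutoff: float = 2.5) -> list[int]:
    """Walk a volumetric OCT scan slice by slice (B-scan by B-scan) and return
    indices whose mean reflectivity deviates from a normative value.
    The scoring rule is a placeholder for a trained wayfinding model."""
    suspects = []
    for i, b_scan in enumerate(oct_volume):          # shape: (slices, rows, cols)
        z = abs(b_scan.mean() - normative_mean) / normative_sd
        if z > z_cutoff:
            suspects.append(i)
    return suspects

# Hypothetical 97-slice macular cube, 8-bit grayscale.
volume = np.random.randint(0, 255, size=(97, 496, 512), dtype=np.uint8)
print(flag_suspect_slices(volume, normative_mean=127.0, normative_sd=5.0))
```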

A level of algorithm beyond machine and deep learning is the convolutional neural network, which is built on redundancies. Think about what comes up when searching a topic online: the refined redundancies remember previous searches and suggest new or additional items of potential interest. This happens to us every day.
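For readers who want to see the shape of such a network, here is a minimal convolutional neural network in PyTorch that takes a fundus image and returns class scores. The layer sizes and the 2-class output are arbitrary choices for illustration only; nothing here reflects a validated clinical model:

```python
import torch
import torch.nn as nn

class TinyFundusCNN(nn.Module):
    """Minimal convolutional network: stacked convolution + pooling layers
    learn image features directly from pixels, then a linear head classifies."""
    def __init__(self, n_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, n_classes)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

model = TinyFundusCNN()
dummy_fundus = torch.randn(1, 3, 224, 224)   # one RGB image, 224 x 224 pixels
print(model(dummy_fundus).shape)             # torch.Size([1, 2])
```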

Diabetic retinopathy as the poster child for the application of AI

Diabetic retinopathy (DR) has been the most studied of the retinal disorders, for several reasons. DR is among the leading causes of vision loss; in fact, according to the Centers for Disease Control and Prevention, it is the leading cause of blindness among people of working age.5 Additionally, the prevalence of DR makes it a fertile area for research. Taking a step back to the classification scheme generated from the Early Treatment Diabetic Retinopathy Study (ETDRS), a few standard photographs formed the basis for classifying the stages of nonproliferative diabetic retinopathy and distinguishing them from the proliferative stages.6,7 We all became familiar with the 4-2-1 rule.6
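The 4-2-1 rule lends itself naturally to a rule-based check. A sketch, assuming quadrant counts have already been graded from the fundus photographs, using the commonly quoted summary of the rule (hemorrhages/microaneurysms in all 4 quadrants, venous beading in 2 or more, or intraretinal microvascular abnormalities in 1 or more):

```python
def meets_421_rule(heme_quadrants: int, venous_beading_quadrants: int,
                   irma_quadrants: int) -> bool:
    """Return True if any branch of the 4-2-1 rule is met, commonly taken to
    indicate severe nonproliferative diabetic retinopathy."""
    return (heme_quadrants >= 4
            or venous_beading_quadrants >= 2
            or irma_quadrants >= 1)

print(meets_421_rule(heme_quadrants=2, venous_beading_quadrants=0, irma_quadrants=0))  # False
print(meets_421_rule(heme_quadrants=4, venous_beading_quadrants=0, irma_quadrants=0))  # True
```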

The Diabetic Retinopathy Severity Scale (DRSS) evolved from the ETDRS scheme and allows a more exact and precise specification of the level of retinopathy.8 The scale runs continuously from level 10 to level 90 and is applied with a yes/no decision tree. Although it uses a staging system similar to that of the ETDRS, it is much more detailed regarding items such as the number, location, and significance of the vasculopathic changes of DR. The importance of such specificity was emphasized when it was invoked to demonstrate improvement in fundus appearance (level score) as well as in visual acuity in the RISE and RIDE trials.9

With this specificity and continuous scaling, such a classification scheme could form the basis for automated image analysis. The next plateau is for automated AI to surpass human cortical decision-making and deliver perfect diagnoses at each encounter, which is a tall task. Convolutional neural networks have this capability, and Scientific American has declared the paradigm shift to AI irreversible.10 A system that could accurately stage nonproliferative DR without clinician input would be invaluable from the standpoints of convenience, patient care, and consistency. In Figure 3, the patient's nonproliferative DR worsened over the course of 13.5 months from moderate to moderate-to-severe DR (ETDRS), or from level 43 to level 53 (DRSS), a more exact specification.

Just as new terminology has replaced the ETDRS designation of clinically significant macular edema with center-involving diabetic macular edema,11,12 teleophthalmology continues to evolve. We can expect that patients will self-image their fundi, perhaps with OCT angiography, and that the images will be transmitted to a reading center or an eye care provider for interpretation and decision-making. Advancing technology and the disruption of the pandemic have intersected to offer interesting innovations.13,14 These forces will drive convenient, safe, effective, and equitably applied eye care for preserving visual function among patients with DR.15,16

The global pandemic has hastened the use of virtual visits. For example, the Mayo Clinic in Jacksonville, Florida, set a strategic benchmark at the beginning of 2020 of having 30% of eligible visits conducted online by 2030. By the beginning of 2022, 60% of eligible visits were being conducted online (Klaas J. Wierenga, MD, personal communication, March 8, 2022).

Screening and virtual ophthalmic evaluations have been deployed in the public space17 and in the clinical space. Although prototypes may have had a rocky start, patients at greatest risk for disease can now be separated from those who are distinctly disease free.18 Interestingly, in this algorithm, eyes with macular edema were grouped with those at greatest risk for vision loss or in need of referral for surgical intervention. One of my local big box pharmacies offers diabetic eye examinations for those who qualify (Figure 4). The inclusion criteria for fundus imaging include exceeding thresholds for random blood glucose or HbA1c, or being treated for type 2 diabetes.
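Such inclusion criteria could be encoded as a simple eligibility check, as sketched below; the cutoff values are illustrative placeholders and not the pharmacy program's actual thresholds:

```python
def eligible_for_fundus_imaging(random_glucose_mg_dl: float, hba1c_percent: float,
                                treated_for_type2: bool,
                                glucose_cutoff: float = 200.0,
                                hba1c_cutoff: float = 6.5) -> bool:
    """Hypothetical screening-eligibility check: meeting any one criterion
    qualifies the patient for fundus imaging. Cutoffs are placeholders."""
    return (random_glucose_mg_dl > glucose_cutoff
            or hba1c_percent > hba1c_cutoff
            or treated_for_type2)

print(eligible_for_fundus_imaging(150.0, 7.1, treated_for_type2=False))  # True
```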

AI is all around us and is here to stay in health care. It has made inroads in the ophthalmic space that will continue to expand. AI will be transformative for medicine by allowing automated analysis of telemetrically gathered data. This will be of great advantage in all clinical settings and will eventually lead to the no-touch patient examination. Bad algorithms will be weeded out, and nearly seamless diagnostic conclusions will result.

References
1. Norden JG, Shah NR. What AI in health care can learn from the long road to autonomous vehicles. NEJM Catalyst. Accessed August 11, 2023. https://catalyst.nejm.org/doi/pdf/10.1056/CAT.21.0458
2. A guide to the medical diagnostic and treatment algorithm used by IBM’s Watson computer system. Explain xkcd. 2015. Accessed August 11, 2023. https://www.explainxkcd.com/wiki/images/1/15/watson_medical_algorithm.png
3. Schmidt-Erfurth U, Sadeghipour A, Gerendas BS, Waldstein SM, Bogunović H. Artificial intelligence in retina. Prog Retin Eye Res. 2018;67:1-29. doi:10.1016/j.preteyeres.2018.07.004
4. Wayfinding AI: a new way to detect retinal disease. Retina Today. Accessed August 11, 2023. https://retinatoday.com/articles/2023-apr/wayfinding-ai-a-new-way-to-detect-retinal-disease?c4src=issue:feed
5. Diabetes and vision loss. Centers for Disease Control and Prevention. Accessed August 11, 2023. https://www.cdc.gov/diabetes/managing/diabetes-vision-loss.html
6. Early Treatment Diabetic Retinopathy Study Research Group. Grading diabetic retinopathy from stereoscopic color fundus photographs: an extension of the modified Airlie House classification. ETDRS report number 10. Ophthalmology. 1991;98(suppl 5):786-806.
7. Early Treatment Diabetic Retinopathy Study Research Group. Classification of diabetic retinopathy from fluorescein angiograms. ETDRS report number 11. Ophthalmology. 1991;98(suppl 5):807-822.
8. Wilkinson CP, Ferris FL III, Klein RE, et al; Global Diabetic Retinopathy Project Group. Proposed international clinical diabetic retinopathy and diabetic macular edema disease severity scales. Ophthalmology. 2003;110(9):1677-1682. doi:10.1016/S0161-6420(03)00475-5
9. Ip MS, Zhang J, Ehrlich JS. The clinical importance of changes in diabetic retinopathy severity score. Ophthalmology. 2017;124(5):596-603. doi:10.1016/j.ophtha.2017.01.003
10. Scientific American. December 1, 2015. Accessed August 11, 2023. https://www.scientificamerican.com/issue/sa/2015/12-01/
11. Danis RP, Glassman AR, Aiello LP, et al; Diabetic Retinopathy Clinical Research Network. Diurnal variation in retinal thickening measurement by optical coherence tomography in center-involved diabetic macular edema. Arch Ophthalmol. 2006;124(12):1701-1707. doi:10.1001/archopht.124.12.1701
12. Brownlee M, Aiello LP, Cooper ME, Vinik AI, Plutzky J, Boulton AJM. Complications of diabetes mellitus. In: Melmed S, Polonsky KS, Larsen PR, Kronenberg HM, eds. Williams Textbook of Endocrinology. 13th ed. Elsevier; 2016:1484-1581. https://www.sciencedirect.com/science/article/abs/pii/B9780323297387000332?via%3Dihub
13. Leung EH, Fan J, Flynn HW Jr, Albini TA. Ocular and systemic complications of COVID-19: impact on patients and healthcare. Clin Ophthalmol. 2022;16:1-13. doi:10.2147/OPTH.S336963
14. Valentim CCS, Muste JC, Iyer AI, et al. Characterization of ophthalmology virtual visits during the COVID-19 pandemic. Eye (Lond). 2023;37(2):332-337. doi:10.1038/s41433-022-01938-2
15. Gilchrist J. Analysis of early diabetic retinopathy by computer processing of fundus images: a preliminary study. Ophthalmic Physiol Opt. 1987;7(4):393-399.
16. Channa R, Wolf R, Abramoff MD. Autonomous artificial intelligence in diabetic retinopathy: from algorithm to clinical application. J Diabetes Sci Technol. 2021;15(3):695-698. doi:10.1177/1932296820909900
17. Ogunyemi O, George S, Patty L, Teklehaimanot S, Baker R. Teleretinal screening for diabetic retinopathy in six Los Angeles urban safety-net clinics: final study results. AMIA Annu Symp Proc. 2013;2013:1082-1088.
18. Abràmoff MD, Lou Y, Erginay A, et al. Improved automated detection of diabetic retinopathy on a publicly available dataset through integration of deep learning. Invest Ophthalmol Vis Sci. 2016;57(13):5200-5206. doi:10.1167/iovs.16-19964