Glaucoma Expert-Level Detection of Angle Closure in Goniophotographs With Convolutional Neural Networks: The Chinese American Eye Study
Abstract
Purpose: To compare the performance of a novel convolutional neural network (CNN) classifier and human graders in detecting angle closure in EyeCam (Clarity Medical Systems, Pleasanton, California, USA) goniophotographs.

Design: Retrospective cross-sectional study.

Methods: Subjects from the Chinese American Eye Study underwent EyeCam goniophotography in 4 angle quadrants. A CNN classifier based on the ResNet-50 architecture was trained to detect angle closure, defined as inability to visualize the pigmented trabecular meshwork, using reference labels by a single experienced glaucoma specialist. The performance of the CNN classifier was assessed using an independent test dataset and reference labels by the single glaucoma specialist or a panel of 3 glaucoma specialists. This performance was compared to that of 9 human graders with a range of clinical experience. Outcome measures included area under the receiver operating characteristic curve (AUC) metrics and Cohen kappa coefficients in the binary classification of open or closed angle.

Results: The CNN classifier was developed using 29,706 open and 2,929 closed angle images. The independent test dataset was composed of 600 open and 400 closed angle images. The CNN classifier achieved excellent performance based on single-grader (AUC = 0.969) and consensus (AUC = 0.952) labels. The agreement between the CNN classifier and consensus labels (κ = 0.746) surpassed that of all non-reference human graders (κ = 0.578-0.702). Human grader agreement with consensus labels improved with clinical experience (P = 0.03).

Conclusion: A CNN classifier can effectively detect angle closure in goniophotographs with performance comparable to that of an experienced glaucoma specialist. This provides an automated method to support remote detection of patients at risk for primary angle closure glaucoma.
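The abstract reports agreement between graders as Cohen kappa coefficients (e.g., κ = 0.746 between the CNN classifier and consensus labels). As a point of reference, the statistic compares observed agreement against the agreement expected by chance from each rater's marginal label frequencies. Below is a minimal sketch of that computation for binary open/closed labels; the function name and example ratings are illustrative, not taken from the study.

```python
from collections import Counter

def cohen_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters labeling the same items.

    kappa = (p_o - p_e) / (1 - p_e), where p_o is observed agreement
    and p_e is chance agreement from the raters' marginal frequencies.
    """
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed agreement: fraction of items where the two raters match.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: product of each rater's marginal rate per class.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum((freq_a[c] / n) * (freq_b[c] / n) for c in freq_a)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical example: 1 = closed angle, 0 = open angle.
print(cohen_kappa([1, 1, 0, 0], [1, 0, 0, 0]))  # → 0.5
```

Here the raters agree on 3 of 4 images (p_o = 0.75) while chance agreement is 0.5, giving κ = 0.5; a value such as the study's 0.746 indicates agreement well above chance.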
Additional Information
© 2021 Elsevier Inc. Received 5 October 2020, Revised 20 January 2021, Accepted 3 February 2021, Available online 9 February 2021. This work was supported by grants U10 EY017337, K23 EY029763, and P30 EY029220 from the National Eye Institute, National Institutes of Health, Bethesda, Maryland, USA; a Young Clinician Scientist Research Award from the American Glaucoma Society, San Francisco, California, USA; a Grant-in-Aid Research Award from Fight for Sight, New York, New York, USA; a Clinical and Community Research Award from the Southern California Clinical and Translational Science Institute, Los Angeles, California, USA; and an unrestricted grant to the Department of Ophthalmology from Research to Prevent Blindness, New York, New York, USA. Financial Disclosures: Rohit Varma is a consultant for Allegro Inc, Allergan, and Bausch Health Companies Inc. The other authors have no financial disclosures. All authors attest that they meet the current ICMJE criteria for authorship.
Attached Files
| Name | Size | MD5 |
|---|---|---|
| Supplemental Material - 1-s2.0-S0002939421000660-mmc1.docx | 2.1 MB | d784c838931b784063361b2ed9f49752 |
| Supplemental Material - 1-s2.0-S0002939421000660-mmc2.docx | 16.2 kB | 96994b3531901b9903bad71deb66ebcf |
| Supplemental Material - 1-s2.0-S0002939421000660-mmc3.docx | 16.0 kB | bbc4a5c805edf59de7e800ab2e7367b0 |
Additional details
- Alternative title
- Glaucoma Expert-level Detection of Angle Closure in Goniophotographs with Convolutional Neural Networks: The Chinese American Eye Study: Automated Angle Closure Detection in Goniophotographs
- Eprint ID
- 108165
- DOI
- 10.1016/j.ajo.2021.02.004
- Resolver ID
- CaltechAUTHORS:20210224-094620349
- Funding
- NIH grant U10 EY017337 (National Eye Institute)
- NIH grant K23 EY029763 (National Eye Institute)
- NIH grant P30 EY029220 (National Eye Institute)
- American Glaucoma Society
- Fight for Sight
- Southern California Clinical and Translational Science Institute
- Research to Prevent Blindness
- Created
- 2021-02-24 (from EPrint's datestamp field)
- Updated
- 2021-03-29 (from EPrint's last_modified field)