Volume 21, Issue 1
  • ISSN: 1573-4056
  • E-ISSN: 1875-6603

Abstract

Objective

Glaucoma is a leading cause of irreversible visual impairment and blindness worldwide, and is primarily linked to elevated intraocular pressure (IOP). Early detection is essential to prevent further visual loss, yet manual diagnosis from retinal fundus images (RFIs) is time-consuming and inefficient. Although automated methods for glaucoma detection (GD) exist, they often rely on single models with manually tuned hyperparameters. This study addresses these limitations by proposing an ensemble-based approach that integrates multiple deep learning (DL) models with automated hyperparameter optimization, aiming to improve diagnostic accuracy and model generalization for practical clinical applications.

Materials and Methods

The RFIs used in this study were sourced from two publicly available datasets (ACRIMA and ORIGA), comprising 1,355 images in total. Our method combines a custom multi-branch convolutional neural network (CNN), a pretrained MobileNet, and a pretrained DenseNet201 to extract complementary features from RFIs. To optimize model performance, we apply Bayesian Optimization (BO) for automated hyperparameter tuning, eliminating the need for manual adjustment. The predictions of these models are then combined with a Dirichlet-based Weighted Average Ensemble (Dirichlet-WAE), which adaptively adjusts each model's weight according to its performance.
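The Dirichlet-WAE idea can be sketched as follows. This is a minimal illustration, not the study's implementation: the validation accuracies, the scale factor, and the toy prediction arrays are all hypothetical, and we assume only that per-model concentration parameters drive a Dirichlet draw whose components serve as ensemble weights.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical validation accuracies for the three base models
# (multi-branch CNN, MobileNet, DenseNet201) -- illustrative values only.
val_acc = np.array([0.91, 0.93, 0.94])

# Use scaled accuracies as Dirichlet concentration parameters, so stronger
# models receive larger expected weights; the scale factor (assumed here)
# controls how tightly the sampled weights cluster around their mean.
alpha = val_acc * 50.0
weights = rng.dirichlet(alpha)  # non-negative, sums to 1

# Toy per-model class probabilities for 4 fundus images (normal vs. glaucoma).
probs = np.array([
    [[0.2, 0.8], [0.7, 0.3], [0.4, 0.6], [0.9, 0.1]],  # model 1
    [[0.1, 0.9], [0.6, 0.4], [0.3, 0.7], [0.8, 0.2]],  # model 2
    [[0.3, 0.7], [0.8, 0.2], [0.2, 0.8], [0.9, 0.1]],  # model 3
])

# Weighted average of the probability vectors, then argmax for the final class.
ensemble_probs = np.tensordot(weights, probs, axes=1)  # shape (4, 2)
preds = ensemble_probs.argmax(axis=1)
```

Because the weights are a convex combination, the ensemble output remains a valid probability distribution for each image, and a better-performing model pulls the average toward its own prediction.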

Results

The proposed ensemble model demonstrated state-of-the-art performance, achieving an accuracy (ACC) of 95.09%, precision (PREC) of 95.51%, sensitivity (SEN) of 94.55%, an F1-score (F1) of 94.94%, and an area under the curve (AUC) of 0.9854. The Dirichlet-based WAE substantially reduced the false positive rate, enhancing diagnostic reliability for GD. When compared with the individual models, the ensemble consistently outperformed them across all evaluation metrics, underscoring its robustness and potential scalability for clinical applications.
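The metrics above follow the standard confusion-matrix definitions. A minimal sketch with hypothetical counts (chosen for illustration, not taken from the study's predictions):

```python
# Hypothetical confusion-matrix counts for a binary glaucoma/normal test split.
tp, fp, fn, tn = 90, 5, 10, 95

acc = (tp + tn) / (tp + tn + fp + fn)   # accuracy
prec = tp / (tp + fp)                   # precision
sen = tp / (tp + fn)                    # sensitivity (recall)
f1 = 2 * prec * sen / (prec + sen)      # harmonic mean of precision and recall
```

A lower false positive count (fp) raises precision directly, which is why the reduced false positive rate reported for the Dirichlet-WAE translates into more balanced precision and sensitivity.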

Conclusion

The integration of ensemble learning (EL) and advanced optimization techniques significantly improved the ACC of GD in RFIs. The enhanced WAE method proved to be a critical factor in achieving well-balanced and highly accurate diagnostic performance, underscoring the importance of EL in medical diagnosis.

© 2025 The Author(s). Published by Bentham Science Publishers. This is an open access article published under CC BY 4.0 https://creativecommons.org/licenses/by/4.0/legalcode
