
Employing deep learning and transfer learning for accurate brain tumor detection.

Authors :
Mathivanan SK
Sonaimuthu S
Murugesan S
Rajadurai H
Shivahare BD
Shah MA
Source :
Scientific reports [Sci Rep] 2024 Mar 27; Vol. 14 (1), pp. 7232. Date of Electronic Publication: 2024 Mar 27.
Publication Year :
2024

Abstract

Artificial intelligence-powered deep learning methods are being used to diagnose brain tumors with high accuracy, owing to their ability to process large amounts of data. Magnetic resonance imaging stands as the gold standard for brain tumor diagnosis using machine vision, surpassing computed tomography, ultrasound, and X-ray imaging in its effectiveness. Despite this, brain tumor diagnosis remains a challenging endeavour due to the intricate structure of the brain. This study delves into the potential of deep transfer learning architectures to elevate the accuracy of brain tumor diagnosis. Transfer learning is a machine learning technique that allows us to repurpose pre-trained models for new tasks. This can be particularly useful for medical imaging tasks, where labelled data is often scarce. Four distinct transfer learning architectures were assessed in this study: ResNet152, VGG19, DenseNet169, and MobileNetv3. The models were trained and validated on a benchmark dataset from Kaggle. Five-fold cross-validation was adopted for training and testing. To enhance the balance of the dataset and improve the performance of the models, image enhancement techniques were applied to the data for the four categories: pituitary, normal, meningioma, and glioma. MobileNetv3 achieved the highest accuracy of 99.75%, significantly outperforming other existing methods. This demonstrates the potential of deep transfer learning architectures to revolutionize the field of brain tumor diagnosis.
(© 2024. The Author(s).)
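The five-fold cross-validation protocol mentioned in the abstract can be sketched in plain Python. This is a minimal, standard-library-only illustration of how a dataset is partitioned into five train/test splits, not the authors' actual pipeline; the sample count and the helper name `five_fold_splits` are illustrative assumptions.

```python
# Minimal sketch of five-fold cross-validation index splitting,
# as used for training and testing in the study (illustrative only).

def five_fold_splits(n_samples, n_folds=5):
    """Yield (train_indices, test_indices) pairs, one per fold."""
    indices = list(range(n_samples))
    fold_size = n_samples // n_folds
    for k in range(n_folds):
        start = k * fold_size
        # The last fold absorbs any remainder when n_samples % n_folds != 0.
        end = (k + 1) * fold_size if k < n_folds - 1 else n_samples
        test = indices[start:end]                  # held-out fold
        train = indices[:start] + indices[end:]    # remaining four folds
        yield train, test

# Example: 20 images split into five folds of 4 test / 16 train each.
for fold, (train, test) in enumerate(five_fold_splits(20)):
    print(f"fold {fold}: {len(train)} train, {len(test)} test")
```

In practice the study would pair such splits with a pre-trained backbone (e.g. MobileNetv3) fine-tuned on the training folds and evaluated on the held-out fold, averaging accuracy across the five runs.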

Details

Language :
English
ISSN :
2045-2322
Volume :
14
Issue :
1
Database :
MEDLINE
Journal :
Scientific reports
Publication Type :
Academic Journal
Accession number :
38538708
Full Text :
https://doi.org/10.1038/s41598-024-57970-7