
Skin Cancer Classification Using Fine-Tuned Transfer Learning of DenseNet-121.

Authors :
Bello, Abayomi
Ng, Sin-Chun
Leung, Man-Fai
Source :
Applied Sciences (2076-3417); Sep2024, Vol. 14 Issue 17, p7707, 25p
Publication Year :
2024

Abstract

Skin cancer diagnosis greatly benefits from advanced machine learning techniques, particularly fine-tuned deep learning models. In our research, we explored the impact of traditional machine learning and fine-tuned deep learning approaches on prediction accuracy. Our findings reveal significant improvements in predictability and accuracy with fine-tuning, particularly evident in deep learning models. The CNN, SVM, and Random Forest classifier achieved high accuracy. However, fine-tuned deep learning models such as EfficientNetB0, ResNet34, VGG16, Inception-v3, and DenseNet121 demonstrated superior performance. To ensure comparability, we fine-tuned these models by incorporating additional layers: one flatten layer and three densely interconnected layers. These layers play a crucial role in enhancing model efficiency and performance. The flatten layer preprocesses multidimensional feature maps, facilitating efficient information flow, while the subsequent dense layers refine feature representations, capturing intricate patterns and relationships within the data. Leveraging LeakyReLU activation functions in the dense layers mitigates the vanishing gradient problem and promotes stable training. Finally, the output dense layer with a sigmoid activation function simplifies decision making for healthcare professionals by providing a binary classification output. Our study underscores the significance of incorporating additional layers in fine-tuned neural network models for skin cancer classification, offering improved accuracy and reliability in diagnosis.
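The classification head described in the abstract (flatten layer → three LeakyReLU dense layers → sigmoid output) can be sketched framework-free in NumPy. This is a minimal illustration, not the paper's implementation: the 7×7×1024 feature-map shape (DenseNet121-like), the layer widths (256, 64, 16), and the omission of bias terms are all assumptions made for brevity.

```python
import numpy as np


def leaky_relu(x, alpha=0.01):
    # LeakyReLU keeps a small slope for negative inputs, which mitigates
    # the vanishing-gradient problem mentioned in the abstract.
    return np.where(x > 0, x, alpha * x)


def sigmoid(x):
    # Squashes the final logit into (0, 1) for binary classification.
    return 1.0 / (1.0 + np.exp(-x))


def classifier_head(feature_map, rng):
    """Flatten + three LeakyReLU dense layers + sigmoid output.

    Weights are random placeholders; in fine-tuning they would be
    learned on top of a frozen or partially frozen backbone.
    """
    # Flatten layer: collapse the multidimensional feature map to a vector.
    x = feature_map.reshape(-1)

    # Three densely interconnected layers (widths are illustrative).
    sizes = [x.size, 256, 64, 16]
    for n_in, n_out in zip(sizes[:-1], sizes[1:]):
        w = rng.standard_normal((n_in, n_out)) * 0.01  # biases omitted
        x = leaky_relu(x @ w)

    # Output dense layer: one sigmoid unit -> probability of the positive class.
    w_out = rng.standard_normal((16, 1)) * 0.01
    return float(sigmoid(x @ w_out)[0])


rng = np.random.default_rng(0)
# Hypothetical backbone output for one image: a 7x7x1024 feature map.
feature_map = rng.standard_normal((7, 7, 1024))
p = classifier_head(feature_map, rng)
print(p)  # a probability in (0, 1)
```

In practice this head would be attached after the convolutional base of DenseNet121 (or the other backbones listed), with the weights trained by backpropagation rather than sampled randomly as here.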

Details

Language :
English
ISSN :
2076-3417
Volume :
14
Issue :
17
Database :
Complementary Index
Journal :
Applied Sciences (2076-3417)
Publication Type :
Academic Journal
Accession number :
179650230
Full Text :
https://doi.org/10.3390/app14177707