1. Offloading the computational complexity of transfer learning with generic features.
- Author
- Khan, Muhammad Safdar Ali; Husen, Arif; Nisar, Shafaq; Ahmed, Hasnain; Muhammad, Syed Shah; Aftab, Shabib
- Subjects
- COMPUTATIONAL complexity, TIME complexity, DEEP learning, DATABASES, EARLY detection of cancer, BREAST imaging
- Abstract
Deep learning approaches are generally complex, requiring extensive computational resources and having high time complexity. Transfer learning is a state-of-the-art approach that reduces the need for high computational resources by reusing pre-trained models without compromising accuracy and performance. In conventional studies, pre-trained models are trained on datasets from different but similar domains and therefore contain many domain-specific features. The computational requirements of transfer learning depend directly on the total number of features, which comprises both domain-specific and generic features. This article investigates the prospects of reducing the computational requirements of transfer learning models by discarding domain-specific features from a pre-trained model. The approach is applied to breast cancer detection using the Curated Breast Imaging Subset of the Digital Database for Screening Mammography (CBIS-DDSM) dataset and is evaluated on performance metrics such as precision, accuracy, recall, F1-score, and computational requirements. Discarding domain-specific features up to a specific limit provides significant performance improvements while minimizing computational requirements: training time is reduced by approximately 12%, processor utilization by approximately 25%, and memory usage by approximately 22%. The proposed transfer learning strategy increases accuracy by approximately 7% and offloads computational complexity expeditiously. [ABSTRACT FROM AUTHOR]
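The idea the abstract describes, pruning a pre-trained model's feature set before training a new classifier head, can be sketched in miniature. This is not the authors' implementation: the variance-based score used here as a proxy for "domain-specific" (low-signal) features, the toy activation matrix, and the 25% discard ratio are all illustrative assumptions.

```python
# Minimal sketch, assuming features from a pre-trained extractor arrive as a
# matrix of activations (rows = images, columns = features). Columns with low
# variance on the target data are treated as a stand-in for domain-specific
# features carrying little transferable signal, and a fixed fraction is dropped.
from statistics import pvariance

def select_generic_features(feature_matrix, discard_ratio=0.25):
    """Rank feature columns by variance over the target data and discard
    the lowest-scoring fraction, returning kept indices and reduced data."""
    n_features = len(feature_matrix[0])
    scores = [pvariance([row[j] for row in feature_matrix])
              for j in range(n_features)]
    n_keep = n_features - int(n_features * discard_ratio)
    kept = sorted(range(n_features), key=lambda j: scores[j], reverse=True)[:n_keep]
    kept.sort()  # preserve the original feature order
    reduced = [[row[j] for j in kept] for row in feature_matrix]
    return kept, reduced

# Toy "pre-trained" activations for four images, eight features each.
activations = [
    [0.9, 0.1, 0.8, 0.2, 0.5, 0.5, 0.1, 0.9],
    [0.8, 0.1, 0.7, 0.2, 0.4, 0.5, 0.1, 0.8],
    [0.1, 0.1, 0.2, 0.2, 0.6, 0.5, 0.9, 0.1],
    [0.2, 0.1, 0.1, 0.2, 0.5, 0.5, 0.8, 0.2],
]
kept, reduced = select_generic_features(activations, discard_ratio=0.25)
print(len(kept), len(reduced[0]))  # 6 of 8 features survive
```

In a real pipeline the reduced feature set would then feed a small trainable head, which is where the training-time and memory savings the abstract reports would come from: fewer features mean fewer weights to update.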
- Published
- 2024