
Effective training of convolutional neural networks for age estimation based on knowledge distillation.

Authors :
Greco, Antonio
Saggese, Alessia
Vento, Mario
Vigilante, Vincenzo
Source :
Neural Computing & Applications. Dec 2022, Vol. 34 Issue 24, p21449-21464. 16p.
Publication Year :
2022

Abstract

Age estimation from face images can be profitably employed in several applications, ranging from digital signage to social robotics and from business intelligence to access control. Only in recent years has the advent of deep learning allowed the design of extremely accurate methods based on convolutional neural networks (CNNs), which achieve remarkable performance in various face analysis tasks. However, these networks are not always applicable in real scenarios, due to time and resource constraints that the most accurate approaches often do not meet. Moreover, in the case of age estimation, there is a lack of large, reliably annotated datasets for training deep neural networks. Within this context, we propose in this paper an effective training procedure of CNNs for age estimation based on knowledge distillation, which allows smaller and simpler "student" models to be trained to match the predictions of a larger "teacher" model. We experimentally show that such student models almost reach the performance of the teacher, obtaining high accuracy on the LFW+, LAP 2016 and Adience datasets while being up to 15 times faster. Furthermore, we evaluate the performance of the student models in the presence of image corruptions, and we demonstrate that some of them are even more resilient to these corruptions than the teacher model. [ABSTRACT FROM AUTHOR]
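For readers unfamiliar with the technique named in the abstract, the sketch below illustrates a standard knowledge-distillation loss, in which a student is trained to match the softened predictions of a teacher alongside the usual hard labels. This is a generic illustration only, not the authors' exact formulation; the PyTorch framework, the temperature `T`, the weight `alpha`, and the 101-class age setup are all assumptions made for the example.

```python
# Minimal sketch of a knowledge-distillation loss (generic, not the paper's
# exact method). Framework (PyTorch), T, alpha, and class count are assumptions.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, targets, T=4.0, alpha=0.7):
    """Blend a soft-target KL term (imitating the teacher) with the usual
    hard-label cross-entropy, as in standard knowledge distillation."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)  # rescale so gradients keep a comparable magnitude
    hard = F.cross_entropy(student_logits, targets)
    return alpha * soft + (1.0 - alpha) * hard

# Toy usage: 8 face crops, 101 age classes (0-100); both are assumptions.
student_logits = torch.randn(8, 101, requires_grad=True)
teacher_logits = torch.randn(8, 101)  # in practice, computed with torch.no_grad()
targets = torch.randint(0, 101, (8,))
loss = distillation_loss(student_logits, teacher_logits, targets)
loss.backward()
```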

Details

Language :
English
ISSN :
0941-0643
Volume :
34
Issue :
24
Database :
Academic Search Index
Journal :
Neural Computing & Applications
Publication Type :
Academic Journal
Accession number :
160074216
Full Text :
https://doi.org/10.1007/s00521-021-05981-0