
LocalEyenet: An attention based deep learning architecture for localization of eyes.

Authors :
Maiti, Somsukla
Gupta, Akshansh
Source :
Expert Systems with Applications. Apr 2024, Vol. 239.
Publication Year :
2024

Abstract

The development of human-machine interfaces has become a necessity for modern machines, enabling greater autonomy and efficiency. Gaze-driven human intervention is an effective and convenient option for creating an interface that alleviates human error. Facial landmark detection is crucial for designing a robust gaze detection system. Regression-based methods provide good spatial localization of the landmarks corresponding to different parts of the face, but there is still scope for improvement, which we address by incorporating attention. In this paper, we propose a deep coarse-to-fine architecture called LocalEyenet for localization of only the eye regions that can be trained end-to-end. The model architecture, built on a stacked hourglass backbone, learns self-attention in feature maps, which aids in preserving global as well as local spatial dependencies in the face image. We incorporate deep layer aggregation in each hourglass to minimize the loss of attention over the depth of the architecture. Our model shows good generalization ability in cross-dataset evaluation and in real-time localization of eyes.

• Deep architecture built on a stacked hourglass backbone for localization of eye regions.
• Self-attention module between hourglasses to learn local spatial and global attention.
• Deep layer aggregation module to learn feature dependencies across the network.
• Differentiable soft-argmax to make the framework an end-to-end trainable architecture.
• Precise and fast localization of eyes tested in real time under varied illumination.

[ABSTRACT FROM AUTHOR]
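The abstract highlights a differentiable soft-argmax as the component that makes the framework end-to-end trainable. As a rough 1-D illustration of that general idea (not the paper's exact formulation, which operates on 2-D landmark heatmaps), soft-argmax replaces the non-differentiable argmax over a heatmap with a softmax-weighted expectation of coordinates; the `beta` temperature below is an assumed parameter, not one taken from the paper:

```python
import math

def soft_argmax(heatmap, beta=10.0):
    """Differentiable 1-D soft-argmax: softmax-weighted expectation of indices.

    `heatmap` is a list of scores; `beta` is a hypothetical temperature
    that sharpens the softmax so the expectation approaches the hard argmax.
    """
    m = max(heatmap)                                   # shift for numerical stability
    exps = [math.exp(beta * (v - m)) for v in heatmap]
    total = sum(exps)
    probs = [e / total for e in exps]                  # softmax over the heatmap
    return sum(i * p for i, p in enumerate(probs))     # expected coordinate
```

For a 2-D heatmap the same expectation would be taken separately along each axis to obtain (x, y) eye coordinates; because every step is differentiable, the localization loss can backpropagate through it into the backbone.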

Details

Language :
English
ISSN :
09574174
Volume :
239
Database :
Academic Search Index
Journal :
Expert Systems with Applications
Publication Type :
Academic Journal
Accession number :
174875339
Full Text :
https://doi.org/10.1016/j.eswa.2023.122416