
Adaptive Short-Temporal Induced Aware Fusion Network for Predicting Attention Regions Like a Driver.

Authors :
Li, Qiang
Liu, Chunsheng
Chang, Faliang
Li, Shuang
Liu, Hui
Liu, Zehao
Source :
IEEE Transactions on Intelligent Transportation Systems; Oct2022, Vol. 23 Issue 10, p18695-18706, 12p
Publication Year :
2022

Abstract

Driver attention prediction addresses the question 'Where should the driver pay attention?'. Most previous methods are designed to predict regional attention with redundant regions. Furthermore, popular spatial-temporal feature extraction networks such as ConvLSTM and 3D-CNN struggle to achieve real-time performance. To overcome these difficulties, we propose an Adaptive Short-temporal Induced Aware Fusion Network (ASIAF-Net) for region-level and object-level driver attention prediction. In ASIAF-Net, we design an Attention Related Spatial Feature Encoder (AF-Encoder) and an Induced Aware Fusion Network (IAF-Net) as the main network; with an Association Analysis Cell (AAC), the AF-Encoder can effectively capture the relationship information between different objects. Considering that the most vital visual cues come from moving objects, we propose a Self-adaptive Short-temporal Feature Extraction Module (SSFE-Module) to obtain inter-frame motion features. In IAF-Net, a Multi-scale Driver Attention Region Prediction Branch is designed to predict regional attention, and an Object Saliency Estimation Branch is proposed to fuse the perception results with the regional attention map to estimate object-level attention. Experiments show that the proposed ASIAF-Net predicts driver attention on regions and objects more robustly and precisely than state-of-the-art methods on three datasets, and that it runs in real time on our ADAS platform. [ABSTRACT FROM AUTHOR]
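
The record provides only the abstract, so the PyTorch sketch below is purely illustrative and is not the authors' implementation: the module names mirror the abstract, but every layer size, the frame-difference heuristic used for the short-temporal branch, and the box-averaging proxy for the Object Saliency Estimation Branch are assumptions introduced here to show how a spatial encoder, an inter-frame motion module, a fused region-level attention map, and object-level scoring could fit together.

```python
# Illustrative sketch only (not ASIAF-Net as published). All shapes, layer
# choices, and the frame-difference motion cue are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SpatialEncoder(nn.Module):
    """Stand-in for the AF-Encoder: a small CNN over the current frame."""
    def __init__(self, out_ch=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, out_ch, 3, stride=2, padding=1), nn.ReLU(),
        )

    def forward(self, frame):
        return self.net(frame)


class ShortTemporalModule(nn.Module):
    """Stand-in for the SSFE-Module: encodes inter-frame motion cues
    from the difference of two consecutive frames (an assumption)."""
    def __init__(self, out_ch=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, out_ch, 3, stride=2, padding=1), nn.ReLU(),
        )

    def forward(self, frame_t, frame_tm1):
        return self.net(frame_t - frame_tm1)


class RegionAttentionHead(nn.Module):
    """Fuses spatial and motion features into a region-level attention map."""
    def __init__(self, in_ch=64):
        super().__init__()
        self.head = nn.Conv2d(in_ch, 1, kernel_size=1)

    def forward(self, spatial_feat, motion_feat, out_size):
        fused = torch.cat([spatial_feat, motion_feat], dim=1)
        attn = torch.sigmoid(self.head(fused))
        return F.interpolate(attn, size=out_size, mode="bilinear",
                             align_corners=False)


def object_level_attention(attn_map, boxes):
    """Scores detected objects by the mean attention inside each box,
    a simple proxy for fusing detections with the regional map."""
    return [attn_map[0, 0, y1:y2, x1:x2].mean().item()
            for x1, y1, x2, y2 in boxes]


if __name__ == "__main__":
    frame_t = torch.rand(1, 3, 128, 256)    # current frame
    frame_tm1 = torch.rand(1, 3, 128, 256)  # previous frame
    spatial = SpatialEncoder()(frame_t)
    motion = ShortTemporalModule()(frame_t, frame_tm1)
    attn = RegionAttentionHead()(spatial, motion, out_size=(128, 256))
    boxes = [(10, 20, 60, 80), (100, 30, 180, 110)]  # hypothetical detections
    print(object_level_attention(attn, boxes))
```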

Details

Language :
English
ISSN :
1524-9050
Volume :
23
Issue :
10
Database :
Complementary Index
Journal :
IEEE Transactions on Intelligent Transportation Systems
Publication Type :
Academic Journal
Accession number :
160686657
Full Text :
https://doi.org/10.1109/TITS.2022.3165619