
Progression-Guided Temporal Action Detection in Videos

Authors :
Lu, Chongkai
Mak, Man-Wai
Li, Ruimin
Chi, Zheru
Fu, Hong
Publication Year :
2023

Abstract

We present a novel framework, Action Progression Network (APN), for temporal action detection (TAD) in videos. The framework locates actions in videos by detecting the action evolution process. To encode the action evolution, we quantify a complete action process into 101 ordered stages (0%, 1%, ..., 100%), referred to as action progressions. We then train a neural network to recognize the action progressions. The framework detects action boundaries by detecting complete action processes in the videos, e.g., a video segment whose detected action progressions closely follow the sequence 0%, 1%, ..., 100%. The framework offers three major advantages: (1) our neural networks are trained end-to-end, in contrast to conventional methods that optimize modules separately; (2) the APN is trained using action frames exclusively, enabling models to be trained on action classification datasets and to remain robust to videos whose temporal background styles differ from those seen in training; (3) our framework effectively avoids detecting incomplete actions and excels at detecting long-lasting actions, owing to the fine-grained and explicit encoding of the temporal structure of actions. Leveraging these advantages, the APN achieves competitive performance and significantly surpasses its counterparts in detecting long-lasting actions. With an IoU threshold of 0.5, the APN achieves a mean Average Precision (mAP) of 58.3% on the THUMOS14 dataset and 98.9% mAP on the DFMAD70 dataset.

Comment: Under Review. Code available at https://github.com/makecent/APN
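
To make the detection idea concrete, below is a minimal sketch (in Python with NumPy) of how per-frame action progressions might be turned into detections: candidate segments are kept when their predicted progressions start near 0%, end near 100%, and closely track a linear 0-100% template. The function name, thresholds, and correlation-based scoring here are illustrative assumptions, not the exact algorithm of the paper or the linked repository.

import numpy as np

def detect_complete_actions(progressions, min_len=16, stride=8, min_corr=0.9):
    """Return (start, end, score) proposals from per-frame progressions in [0, 100]."""
    progressions = np.asarray(progressions, dtype=float)
    proposals = []
    n = len(progressions)
    for start in range(0, n - min_len, stride):
        for end in range(start + min_len, n + 1, stride):
            seg = progressions[start:end]
            # Ideal evolution of a complete action: 0%, 1%, ..., 100%.
            template = np.linspace(0, 100, num=len(seg))
            # Keep segments that begin near 0%, finish near 100%,
            # and follow the linear template closely.
            if seg[0] < 10 and seg[-1] > 90:
                score = float(np.corrcoef(seg, template)[0, 1])
                if score > min_corr:
                    proposals.append((start, end, score))
    return proposals

# Toy example: one complete action surrounded by background frames whose
# progression predictions are unreliable (the model is trained on action frames only).
rng = np.random.default_rng(0)
frame_progressions = np.concatenate([
    rng.uniform(0, 100, 30),      # background before the action
    np.linspace(0, 100, 60),      # a complete action evolution
    rng.uniform(0, 100, 30),      # background after the action
])
print(detect_complete_actions(frame_progressions))

In practice such overlapping proposals would also be class-scored and merged (e.g., with non-maximum suppression); this sketch only illustrates how matching the 0%-100% progression sequence avoids firing on incomplete actions.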

Details

Database :
OAIster
Publication Type :
Electronic Resource
Accession number :
edsoai.on1438472470
Document Type :
Electronic Resource