Deep Motion Prior for Weakly-Supervised Temporal Action Localization
- Source :
- IEEE Transactions on Image Processing. 2022, Vol. 31, p5203-5213. 11p.
- Publication Year :
- 2022
Abstract
- Weakly-Supervised Temporal Action Localization (WSTAL) aims to localize actions in untrimmed videos with only video-level labels. Currently, most state-of-the-art WSTAL methods follow a Multi-Instance Learning (MIL) pipeline: producing snippet-level predictions first and then aggregating them into a video-level prediction. However, we argue that existing methods have overlooked two important drawbacks: 1) inadequate use of motion information and 2) the incompatibility of the prevailing cross-entropy training loss. In this paper, we show that the motion cues underlying optical flow features provide complementary information. Inspired by this, we propose to build a context-dependent motion prior, termed motionness. Specifically, a motion graph is introduced to model motionness based on a local motion carrier (e.g., optical flow). In addition, to highlight more informative video snippets, a motion-guided loss is proposed to modulate network training conditioned on motionness scores. Extensive ablation studies confirm that motionness effectively models actions of interest, and the motion-guided loss leads to more accurate results. Moreover, our motion-guided loss is a plug-and-play loss function applicable to existing WSTAL methods. Without loss of generality, based on the standard MIL pipeline, our method achieves new state-of-the-art performance on three challenging benchmarks: THUMOS’14, ActivityNet v1.2, and v1.3. [ABSTRACT FROM AUTHOR]
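The MIL pipeline the abstract describes can be sketched in miniature. The snippet below is a minimal illustration, not the paper's implementation: it assumes the common WSTAL choice of top-k mean pooling for snippet-to-video aggregation, and a hypothetical motion-guided variant that reweights snippet scores by a motionness prior before pooling, as the abstract suggests. Function names (`topk_aggregate`, `motion_guided_aggregate`, `bce_loss`) and the exact weighting scheme are illustrative assumptions.

```python
import math

def topk_aggregate(snippet_scores, k):
    """Standard MIL pooling assumed here: the video-level score for a
    class is the mean of the top-k snippet-level scores."""
    top = sorted(snippet_scores, reverse=True)[:k]
    return sum(top) / len(top)

def motion_guided_aggregate(snippet_scores, motionness, k):
    """Hypothetical motion-guided variant: each snippet score is
    modulated by its motionness prior before top-k pooling, so
    snippets carrying strong motion cues dominate the video-level
    prediction (the paper's actual loss-level modulation may differ)."""
    weighted = [s * m for s, m in zip(snippet_scores, motionness)]
    top = sorted(weighted, reverse=True)[:k]
    return sum(top) / len(top)

def bce_loss(p, y):
    """Binary cross-entropy between the video-level probability p and
    the video-level label y (0 or 1), the usual MIL training loss."""
    eps = 1e-7
    p = min(max(p, eps), 1.0 - eps)
    return -(y * math.log(p) + (1 - y) * math.log(1 - p))

# Toy video: four snippets, two with high action scores.
scores = [0.9, 0.1, 0.8, 0.2]
motionness = [1.0, 0.2, 0.1, 0.3]  # only the first snippet moves much
plain = topk_aggregate(scores, k=2)            # 0.85
guided = motion_guided_aggregate(scores, motionness, k=2)
loss = bce_loss(guided, y=1)
```

Note how the motionness prior suppresses the high-scoring but low-motion third snippet, changing which snippets drive the video-level prediction and hence the gradient each snippet receives.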
- Subjects :
- *OPTICAL flow
*MOTION
*ADAPTIVE optics
*OPTICAL losses
*FEATURE extraction
Details
- Language :
- English
- ISSN :
- 1057-7149
- Volume :
- 31
- Database :
- Academic Search Index
- Journal :
- IEEE Transactions on Image Processing
- Publication Type :
- Academic Journal
- Accession number :
- 170077329
- Full Text :
- https://doi.org/10.1109/TIP.2022.3193752