Joint Detection and Association for End-to-End Multi-object Tracking.
- Source :
- Neural Processing Letters; Dec2023, Vol. 55 Issue 9, p11823-11844, 22p
- Publication Year :
- 2023
Abstract
- Multi-object tracking (MOT) is mainly used for detecting and tracking objects across multiple cameras, and is widely applied in intelligent video surveillance and intelligent security. The process of MOT generally involves three important parts: feature extraction, multi-task learning, and object matching. Unfortunately, existing methods still have some drawbacks. First, the feature extraction module cannot effectively fuse shallow and deep features. Moreover, the multi-task learning module cannot strike a good balance between detection and re-identification. In addition, the object matching module associates pedestrians using a traditional method rather than a trained model. To address these problems, we propose a joint detection and association (JDA) method for an end-to-end multi-object tracking network, which involves multi-scale feature extraction and learnable object association. It first combines a feature extraction backbone based on multi-scale feature fusion with a point-based multi-task object detection branch to solve the tasks of feature extraction and object detection. Then, a learnable object motion association module is embedded, which uses information from historical frames to infer the position of an object and associate object identities between previous and subsequent frames. In addition, JDA can be trained end-to-end when handling the detection and matching tasks. The proposed JDA is evaluated through a series of experiments on MOT16 and MOT17. The results show that JDA outperforms existing methods in terms of the precision and stability of MOT. [ABSTRACT FROM AUTHOR]
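- To give a concrete picture of the pipeline described in the abstract, the following is a minimal PyTorch-style sketch of a JDA-like structure: a multi-scale fusion backbone, a point-based multi-task detection head, and a learnable motion-association module that predicts object offsets from historical frames. The module names, layer shapes, and the offset-prediction MLP are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical structural sketch of a JDA-style tracker (not the paper's code).
import torch
import torch.nn as nn


class MultiScaleFusionBackbone(nn.Module):
    """Extracts features at two scales and fuses shallow and deep maps."""

    def __init__(self, channels: int = 64):
        super().__init__()
        self.shallow = nn.Conv2d(3, channels, 3, stride=2, padding=1)
        self.deep = nn.Conv2d(channels, channels, 3, stride=2, padding=1)
        self.fuse = nn.Conv2d(2 * channels, channels, 1)
        self.up = nn.Upsample(scale_factor=2, mode="bilinear", align_corners=False)

    def forward(self, img: torch.Tensor) -> torch.Tensor:
        s = torch.relu(self.shallow(img))   # stride-2 shallow features
        d = torch.relu(self.deep(s))        # stride-4 deep features
        return self.fuse(torch.cat([s, self.up(d)], dim=1))


class PointDetectionHead(nn.Module):
    """Point-based multi-task head: center heatmap, box size, re-ID embedding."""

    def __init__(self, channels: int = 64, embed_dim: int = 128):
        super().__init__()
        self.heatmap = nn.Conv2d(channels, 1, 1)             # object-center confidence
        self.size = nn.Conv2d(channels, 2, 1)                # box width / height
        self.embedding = nn.Conv2d(channels, embed_dim, 1)   # identity features

    def forward(self, feat: torch.Tensor):
        return self.heatmap(feat).sigmoid(), self.size(feat), self.embedding(feat)


class MotionAssociationModule(nn.Module):
    """Learnable association: predicts current-frame offsets from past centers."""

    def __init__(self, history: int = 4, hidden: int = 64):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(history * 2, hidden), nn.ReLU(), nn.Linear(hidden, 2)
        )

    def forward(self, past_centers: torch.Tensor) -> torch.Tensor:
        # past_centers: (num_tracks, history, 2) -> predicted (dx, dy) per track
        return self.mlp(past_centers.flatten(1))


if __name__ == "__main__":
    backbone, head, assoc = MultiScaleFusionBackbone(), PointDetectionHead(), MotionAssociationModule()
    feat = backbone(torch.randn(1, 3, 256, 256))
    heat, size, embed = head(feat)
    offsets = assoc(torch.randn(5, 4, 2))   # 5 tracks, 4 historical frames each
    print(heat.shape, size.shape, embed.shape, offsets.shape)
```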
Details
- Language :
- English
- ISSN :
- 13704621
- Volume :
- 55
- Issue :
- 9
- Database :
- Complementary Index
- Journal :
- Neural Processing Letters
- Publication Type :
- Academic Journal
- Accession number :
- 174473522
- Full Text :
- https://doi.org/10.1007/s11063-023-11397-9