Vision System with Personnel Recognition and Tracking Functions for Use in a Space with Multiple Persons.
- Source :
- Sensors & Materials; May 2024, Vol. 36, Issue 5, Part 2, p1881-1903, 23p
- Publication Year :
- 2024
Abstract
- In this study, we propose a vision system for personnel recognition and tracking in a space with multiple persons. The personnel tracking function can be implemented in service vehicles or medical systems to identify specific users, thus facilitating service tasks such as follow-up care or support. User identification and tracking become highly challenging in environments that contain multiple people who are not wearing any external sensors. The proposed system involves three steps. First, the You Only Look Once version 4 tiny (YOLOv4-tiny) object detection model is used to extract a personnel bounding box from an image. Second, the image coordinates and camera coordinates are converted into three-dimensional (3D) space coordinates on the basis of a depth map derived from a binocular camera to obtain 3D spatial information on the personnel. Finally, the unscented Kalman filter (UKF), FaceNet, or a combination of the UKF and FaceNet (UKF-FaceNet) produced through decision tree fusion is used to identify or track the target personnel. Experimental results indicated that the proposed system can successfully track personnel. The UKF-FaceNet method effectively mitigates the drawbacks of UKF-based and FaceNet-based tracking and can be adopted to achieve accurate and stable personnel identification in an environment with multiple persons. This method can re-identify and re-track a target user accurately when the user is obscured. [ABSTRACT FROM AUTHOR]
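- The abstract outlines two steps that lend themselves to a brief illustration: back-projecting a detected person's pixel location into 3D camera coordinates using a binocular depth map, and fusing a UKF motion prediction with a FaceNet identity match. The following is a minimal sketch of those two ideas only, assuming a standard pinhole camera model and illustrative thresholds; the function names, parameters, and simplified fusion rule are assumptions for exposition, not the paper's implementation.

```python
import numpy as np

def pixel_to_camera_xyz(u, v, depth_m, fx, fy, cx, cy):
    """Back-project a pixel (u, v) with depth in metres into 3D camera coordinates
    using a pinhole model; fx, fy, cx, cy would come from stereo calibration."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return np.array([x, y, depth_m])

def fuse_track_and_identity(ukf_pred_xyz, det_xyz, face_embedding, target_embedding,
                            gate_m=0.8, face_thresh=1.0):
    """Simplified stand-in for decision-tree fusion: accept a detection as the target
    when its FaceNet embedding matches the enrolled target (appearance cue), or when
    it falls inside a distance gate around the UKF-predicted position (motion cue).
    Both thresholds are illustrative placeholders."""
    face_ok = (face_embedding is not None
               and np.linalg.norm(face_embedding - target_embedding) < face_thresh)
    motion_ok = np.linalg.norm(det_xyz - ukf_pred_xyz) < gate_m
    if face_ok:
        return True, "re-identified by FaceNet"
    if motion_ok:
        return True, "continued by UKF gate"
    return False, "rejected"
```

- In this sketch the appearance cue takes priority when a face embedding is available, with the motion gate as a fallback when the face is occluded, which loosely mirrors the re-identification and re-tracking behavior described in the abstract; the actual system combines the cues through a learned decision tree rather than fixed thresholds.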
- Subjects :
- FUNCTION spaces
- KALMAN filtering
- DECISION trees
- ARTIFICIAL satellite tracking
Details
- Language :
- English
- ISSN :
- 0914-4935
- Volume :
- 36
- Issue :
- 5, Part 2
- Database :
- Complementary Index
- Journal :
- Sensors & Materials
- Publication Type :
- Academic Journal
- Accession number :
- 177593481
- Full Text :
- https://doi.org/10.18494/SAM4527