A Cross-View Gait Recognition Method Using Two-Way Similarity Learning

Authors: Y. J. Qi, Y. P. Kong, Q. Zhang
Source: Mathematical Problems in Engineering
Publication Year: 2022
Publisher: Hindawi, 2022

Abstract

Gait recognition is a powerful tool for long-distance identification. However, gait is influenced by the walking environment and by appearance changes, so the recognition rate declines sharply when the viewing angle changes. In this work, we propose a novel cross-view gait recognition method based on two-way similarity learning. Focusing on the relationships between gait elements in three-dimensional space and the wholeness of human body movement, we design a three-dimensional gait constraint model, built on joint motion constraint relationships, that is robust to view changes. Unlike classic three-dimensional models, the proposed model characterizes both motion constraints and action constraints between joints along the time and space dimensions. We then propose an end-to-end two-way gait network that uses a long short-term memory (LSTM) network and a 50-layer residual network (ResNet-50) to extract the temporal and spatial difference features of model pairs, respectively. The two types of difference features are merged at a high level in the network, and similarity values are obtained through a softmax layer. We evaluate the method on the challenging CASIA-B dataset for cross-view gait recognition. The experimental results show that it achieves a higher recognition rate than previously developed model-based methods, reaching 72.8% for normal walking when the viewing angle changes from 36° to 144°. Finally, the method also performs better at large cross-view angles, illustrating that the model is robust to viewing-angle changes and that the proposed network offers considerable potential in practical application scenarios.
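This record contains no code, so the following PyTorch sketch is only a minimal illustration of the two-branch design the abstract describes: an LSTM branch for temporal difference features, a ResNet-50 branch for spatial difference features, high-level fusion, and a two-way softmax producing similarity values. The class name TwoWayGaitNet, the input encodings (a per-frame joint-difference sequence and a 2-D difference map), and all dimensions are assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn
from torchvision.models import resnet50

class TwoWayGaitNet(nn.Module):
    """Hypothetical two-branch similarity network: an LSTM encodes the
    temporal differences between a pair of gait sequences, a ResNet-50
    encodes their spatial differences, and the fused features are scored
    with a softmax over {dissimilar, similar}."""

    def __init__(self, joint_dim=75, hidden_dim=128):
        super().__init__()
        # Temporal branch: LSTM over per-frame joint-difference vectors
        # (joint_dim is an assumption, e.g. 25 joints x 3 coordinates).
        self.lstm = nn.LSTM(joint_dim, hidden_dim, batch_first=True)
        # Spatial branch: ResNet-50 over a single-channel difference map.
        self.cnn = resnet50(weights=None)
        self.cnn.conv1 = nn.Conv2d(1, 64, kernel_size=7, stride=2,
                                   padding=3, bias=False)  # 1-channel input
        self.cnn.fc = nn.Linear(self.cnn.fc.in_features, hidden_dim)
        # High-level fusion followed by a two-way classifier.
        self.classifier = nn.Sequential(
            nn.Linear(2 * hidden_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, 2),
        )

    def forward(self, seq_diff, map_diff):
        # seq_diff: (batch, frames, joint_dim) temporal difference sequence
        # map_diff: (batch, 1, H, W) spatial difference map of the pair
        _, (h_n, _) = self.lstm(seq_diff)
        temporal_feat = h_n[-1]                # (batch, hidden_dim)
        spatial_feat = self.cnn(map_diff)      # (batch, hidden_dim)
        fused = torch.cat([temporal_feat, spatial_feat], dim=1)
        return torch.softmax(self.classifier(fused), dim=1)

# Usage with random stand-in data:
net = TwoWayGaitNet()
seq_diff = torch.randn(4, 60, 75)        # 4 pairs, 60 frames each
map_diff = torch.randn(4, 1, 224, 224)   # per-pair difference maps
probs = net(seq_diff, map_diff)          # (4, 2) similarity probabilities
```

Such a network would typically be trained with cross-entropy over labeled same/different gait pairs; note that the paper's actual input representation is its three-dimensional gait constraint model, which this sketch does not reconstruct.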

Details

Language: English
ISSN: 1024-123X
Database: OpenAIRE
Journal: Mathematical Problems in Engineering
Accession number: edsair.doi.dedup.....fa0085c612a36282d1d334205536baf3
Full Text: https://doi.org/10.1155/2022/2674425