1. Millimeter wave gesture recognition using multi-feature fusion models in complex scenes.
- Authors
- Hao, Zhanjun; Sun, Zhizhou; Li, Fenfang; Wang, Ruidong; and Peng, Jianxiang
- Subjects
- MILLIMETER waves, ARTIFICIAL neural networks, CONVOLUTIONAL neural networks, GESTURE, SIGNAL-to-noise ratio, BODY language, STIMULUS generalization, RISK perception
- Abstract
As a form of body language, gestures play an important role in smart homes, game interaction, sign language communication, and other applications. Gesture recognition has been studied extensively, but existing methods have inherent limitations regarding user experience, visual environment, and recognition granularity. Millimeter wave radar offers an effective way to address these problems because of its considerable bandwidth and high-precision sensing. However, when millimeter wave radar is applied in complex scenes, interfering factors and model complexity pose enormous challenges to the practical application of gesture recognition. This work proposes a gesture recognition method for complex scenes based on multi-feature fusion. We collected data in a variety of places to improve sample reliability, filtered clutter to improve the signal-to-noise ratio (SNR), and then extracted multiple features, namely the range-time map (RTM), Doppler-time map (DTM), and angle-time map (ATM), fusing them to enhance the richness and expressive power of the features. A lightweight neural network model, multi-CNN-LSTM, is designed for gesture recognition. The model consists of three convolutional neural networks (CNNs), one for each of the extracted features, and one long short-term memory (LSTM) network for temporal features. We analyzed the performance and complexity of the model and verified the effectiveness of the feature extraction. Extensive experiments show that the method has generalization ability, adaptability, and high robustness in complex scenarios, reaching a recognition accuracy of 97.28% on 14 experimental gestures. [ABSTRACT FROM AUTHOR]
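The fusion pipeline described in the abstract (three per-map CNN branches whose outputs are combined and fed to an LSTM over time) can be sketched as follows. This is a shapes-only illustration under assumed dimensions, not the authors' implementation: the frame count `T`, map size `H`×`W`, feature size `F`, and the toy stand-ins for the CNN branches and the LSTM are all hypothetical.

```python
# Hypothetical sketch of the multi-CNN-LSTM fusion idea: three branch
# extractors (stand-ins for the RTM/DTM/ATM CNNs), per-frame feature
# concatenation, and a running temporal summary (stand-in for the LSTM).
T, H, W = 32, 16, 16   # assumed number of frames and map resolution
F = 8                  # assumed per-branch feature size

def cnn_branch(frame, feat_dim=F):
    """Toy stand-in for a CNN: pool an HxW map down to feat_dim values."""
    flat = [v for row in frame for v in row]
    step = max(1, len(flat) // feat_dim)
    return [sum(flat[i:i + step]) / step
            for i in range(0, step * feat_dim, step)]

def fuse(rtm, dtm, atm):
    """Concatenate the three branch features frame by frame."""
    return [cnn_branch(r) + cnn_branch(d) + cnn_branch(a)
            for r, d, a in zip(rtm, dtm, atm)]

def lstm_like(seq):
    """Toy temporal summary: exponential moving average over frames."""
    state = [0.0] * len(seq[0])
    for x in seq:
        state = [0.9 * s + 0.1 * v for s, v in zip(state, x)]
    return state

# Three synthetic feature-map sequences (RTM, DTM, ATM), each T frames of HxW
maps = [[[0.0] * W for _ in range(H)] for _ in range(T)]
fused = fuse(maps, maps, maps)
summary = lstm_like(fused)
print(len(fused), len(fused[0]), len(summary))  # 32 24 24
```

In the paper's design the fused per-frame vector would instead come from learned convolutional layers, and the LSTM would carry a learned hidden state; the sketch only shows how the three map streams merge into one temporal sequence.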
- Published
- 2024