
Multiscale Deep Feature Learning for Human Activity Recognition Using Wearable Sensors.

Authors :
Tang, Yin
Zhang, Lei
Min, Fuhong
He, Jun
Source :
IEEE Transactions on Industrial Electronics. Feb 2023, Vol. 70, Issue 2, p2106-2116. 11p.
Publication Year :
2023

Abstract

Deep convolutional neural networks (CNNs) achieve state-of-the-art performance in wearable human activity recognition (HAR), which has become a new research trend in ubiquitous computing scenarios. Increasing network depth or width can further improve accuracy. However, to obtain optimal HAR performance on mobile platforms, a reasonable tradeoff between recognition accuracy and resource consumption must be considered. Improving the performance of CNNs without increasing the memory and computational burden is therefore more beneficial for HAR. In this article, we first propose a new CNN that uses the hierarchical-split (HS) idea for a large variety of HAR tasks, which enhances multiscale feature representation by capturing a wider range of receptive fields of human activities within one feature layer. Experiments conducted on benchmarks demonstrate that the proposed HS module is an impressive alternative to baseline models of similar model complexity, achieving higher recognition performance (e.g., 97.28%, 93.75%, 99.02%, and 79.02% classification accuracy) on UCI-HAR, PAMAP2, WISDM, and UNIMIB-SHAR, respectively. Extensive ablation studies are performed to evaluate the effect of varying receptive fields on classification performance. Finally, we demonstrate that multiscale receptive fields help to learn more discriminative features (achieving 94.10% SOTA accuracy) on a weakly labeled HAR dataset. [ABSTRACT FROM AUTHOR]
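To make the hierarchical-split idea concrete, the sketch below shows one way a multiscale HS-style 1-D convolution block for sensor windows could look in PyTorch: the channels are split into groups, the first group is passed through unchanged, and each later group is convolved after being summed with the previous group's output, so successive splits see progressively larger receptive fields within a single feature layer. This is only a minimal illustration following the general Res2Net/HS-ResNet-style splitting scheme, not the authors' actual module; the class name, layer sizes, kernel size, and fusion choice are assumptions for illustration.

# Hedged sketch of a hierarchical-split (HS) style multiscale block for
# wearable-sensor sequences. NOT the paper's exact implementation; all
# hyperparameters below are illustrative assumptions.
import torch
import torch.nn as nn


class HierarchicalSplitConv1d(nn.Module):
    """Split channels into `splits` groups and convolve them hierarchically,
    so later groups accumulate progressively larger receptive fields."""

    def __init__(self, channels: int, splits: int = 4, kernel_size: int = 3):
        super().__init__()
        assert channels % splits == 0, "channels must be divisible by splits"
        self.splits = splits
        width = channels // splits
        # One conv per split except the first, which is an identity path.
        self.convs = nn.ModuleList(
            nn.Sequential(
                nn.Conv1d(width, width, kernel_size, padding=kernel_size // 2),
                nn.BatchNorm1d(width),
                nn.ReLU(inplace=True),
            )
            for _ in range(splits - 1)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, time)
        chunks = torch.chunk(x, self.splits, dim=1)
        out = [chunks[0]]                 # first split: passed through unchanged
        prev = None
        for i, conv in enumerate(self.convs):
            inp = chunks[i + 1] if prev is None else chunks[i + 1] + prev
            prev = conv(inp)              # each step widens the receptive field
            out.append(prev)
        return torch.cat(out, dim=1)      # concatenate multiscale features


if __name__ == "__main__":
    # Toy usage: a 64-channel feature map over a 128-sample sensor window.
    block = HierarchicalSplitConv1d(channels=64, splits=4)
    y = block(torch.randn(8, 64, 128))
    print(y.shape)  # torch.Size([8, 64, 128])

In this sketch the output keeps the input channel count and length, so the block can replace a plain convolution layer without changing the surrounding network, which matches the paper's goal of adding multiscale capacity without extra memory or computational burden.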

Details

Language :
English
ISSN :
0278-0046
Volume :
70
Issue :
2
Database :
Academic Search Index
Journal :
IEEE Transactions on Industrial Electronics
Publication Type :
Academic Journal
Accession number :
160652014
Full Text :
https://doi.org/10.1109/TIE.2022.3161812