Real-time action recognition by feature-level fusion of depth and inertial sensor
- Source :
- RCAR
- Publication Year :
- 2017
- Publisher :
- IEEE, 2017.
Abstract
- Human action recognition has been an active research topic because of its wide range of applications. Most existing research focuses on recognition from a single sensing modality. In this paper, we present a novel approach to human action recognition based on feature-level fusion of depth and inertial sensor data. We extract Fast Fourier Transform (FFT) coefficients from acceleration signals and Histograms of Oriented Gradients (HOG) features from Motion Response Maps (MRM). After obtaining these two modality-specific feature vectors, we apply Discriminant Correlation Analysis (DCA) to learn a fused feature descriptor with stronger discriminative ability. To evaluate the effectiveness and efficiency of the proposed approach, we conduct experiments on the multimodal human action database CAS-YNU-MHAD. Experimental results demonstrate that the fused feature descriptor yields a strong and stable improvement in recognition accuracy. Moreover, the approach has low computational complexity and can be employed in real-time systems.
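A minimal sketch of the feature-extraction side of this pipeline, assuming a 3-axis accelerometer window and a precomputed motion response map derived from depth frames. The function names, window size, number of FFT coefficients, and HOG parameters are illustrative choices, not taken from the paper, and a plain concatenation stands in for the DCA fusion step described in the abstract:

```python
import numpy as np
from skimage.feature import hog

def fft_features(accel_window, n_coeffs=16):
    """FFT magnitude coefficients per axis of an acceleration window.

    accel_window: (T, 3) array of x/y/z accelerometer samples.
    Returns the first n_coeffs magnitudes of each axis, flattened.
    """
    spectrum = np.abs(np.fft.rfft(accel_window, axis=0))  # (T//2 + 1, 3)
    return spectrum[:n_coeffs].ravel(order="F")

def hog_features(motion_response_map):
    """HOG descriptor of a 2-D motion response map built from depth frames."""
    return hog(motion_response_map,
               orientations=9,
               pixels_per_cell=(8, 8),
               cells_per_block=(2, 2),
               feature_vector=True)

# One action sample from both sensors (synthetic placeholders).
accel = np.random.randn(128, 3)   # inertial sensor window
mrm = np.random.rand(64, 64)      # motion response map from depth

x_inertial = fft_features(accel)
x_depth = hog_features(mrm)

# Feature-level fusion: the paper projects and combines the two vectors with
# Discriminant Correlation Analysis (DCA); simple concatenation is used here
# only as a placeholder for that step.
fused = np.concatenate([x_inertial, x_depth])
```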
- Subjects :
- Modality (human–computer interaction)
  Computer science
  Pattern recognition
  Feature extraction
  Feature vector
  Feature (machine learning)
  Fast Fourier transform
  Histogram
  Computational complexity theory
  Image processing and computer vision
  Artificial intelligence
Details
- Database :
- OpenAIRE
- Journal :
- 2017 IEEE International Conference on Real-time Computing and Robotics (RCAR)
- Accession number :
- edsair.doi...........5f634ffc5819672ae10de9544d703df0
- Full Text :
- https://doi.org/10.1109/rcar.2017.8311844