
Multimodal Sensor Data Fusion for Activity Recognition Using Filtered Classifier

Authors :
Muhammad Asif Razzaq
Ian Cleland
Chris Nugent
Sungyoung Lee
Source :
Proceedings, Vol 2, Iss 19, p 1262 (2018)
Publication Year :
2018
Publisher :
MDPI AG, 2018.

Abstract

Activity recognition (AR) is a core task in pervasive computing and context-aware systems that captures the physical state of a human in real time. Such systems add a new dimension to widely deployed applications by fusing activities recognized from the raw sensory data generated by both obtrusive and unobtrusive digital technologies. AR technologies have grown rapidly in recent years, and much of the existing literature focuses on applying machine learning algorithms to obtrusive, single-modality sensor devices. To encourage work on multimodal data, the University of Jaén Ambient Intelligence (UJAmI) Smart Lab in Spain initiated the 1st UCAmI Cup challenge, sharing several varieties of sensory data for recognizing human activities in a smart environment. This paper presents the fusion of these multimodal sensors at both the feature level and the decision level, preprocessing the data and predicting activities on the training and test datasets. The approach achieves 94% accuracy on the training data but only 47% accuracy on the test data. The study therefore also analyzes the confusion matrix and attributes the discrepancy to factors such as the imbalanced class distribution between the training and test datasets. Additionally, it highlights challenges associated with the datasets whose resolution could improve further analysis.
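To make the pipeline described in the abstract concrete, the sketch below illustrates feature-level fusion (concatenating per-modality feature vectors) followed by a filter-plus-classifier combination. The title's "Filtered Classifier" presumably refers to Weka's FilteredClassifier meta-learner; the sketch approximates that pattern with a scikit-learn Pipeline. The modality names, feature shapes, and choice of base classifier are illustrative assumptions, not the paper's exact configuration.

```python
# Minimal sketch: feature-level fusion + a filter/classifier pipeline in the
# spirit of Weka's FilteredClassifier. Modalities, shapes, and the base
# classifier are assumptions for illustration, not the paper's configuration.
import numpy as np
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score, confusion_matrix

rng = np.random.default_rng(0)

# Synthetic per-modality feature matrices for n time windows (placeholders
# for features extracted from e.g. binary sensors, proximity, acceleration).
n = 200
binary_feats = rng.integers(0, 2, size=(n, 10)).astype(float)
proximity_feats = rng.random((n, 4))
accel_feats = rng.random((n, 6))
y = rng.integers(0, 5, size=n)  # 5 hypothetical activity classes

# Feature-level fusion: concatenate the per-modality feature vectors.
X = np.hstack([binary_feats, proximity_feats, accel_feats])

# "Filtered classifier": the filter (scaling here) is fit on the training
# split only and applied identically at prediction time, mirroring how
# Weka's meta-learner wraps a filter around a base classifier.
model = Pipeline([
    ("filter", StandardScaler()),
    ("clf", RandomForestClassifier(n_estimators=100, random_state=0)),
])

split = int(0.8 * n)
model.fit(X[:split], y[:split])
pred = model.predict(X[split:])
print("accuracy:", accuracy_score(y[split:], pred))
print(confusion_matrix(y[split:], pred))  # per-class error analysis
```

For the decision-level fusion also reported in the paper, one would instead train a separate classifier per modality and combine their predictions, e.g. by majority voting; in scikit-learn this pattern can be expressed with sklearn.ensemble.VotingClassifier.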

Details

Language :
English
ISSN :
2504-3900
Volume :
2
Issue :
19
Database :
Directory of Open Access Journals
Journal :
Proceedings
Publication Type :
Academic Journal
Accession number :
edsdoj.43d58d7300184b39b931b594acea4df6
Document Type :
Article
Full Text :
https://doi.org/10.3390/proceedings2191262