1. Probabilistic Cascading Classifier for Energy-Efficient Activity Monitoring in Wearables.
- Author
- Pedram, Mahdi; Sah, Ramesh Kumar; Rokni, Seyed Ali; Nourollahi, Marjan; Ghasemzadeh, Hassan
- Abstract
Advances in embedded systems have made it possible to integrate several small health monitoring devices into daily human life. This trend has led to the ongoing expansion of wearable sensors into a broad range of applications. Wearable technologies, worn directly on the human body, use sensors and machine learning to characterize individuals’ physical or psychological routines through activity recognition and analysis of human movement. Since wearables are used all day long, the power consumption of these systems needs to be reasonably low. Current research assumes that such machine learning models are trained with fixed properties, including the sensor sampling rate and the statistical features computed from the time series data. However, in practice, wearables require continuous reconfiguration of their computational algorithms due to the personalized nature of human gait and movement. Furthermore, computational algorithms must be energy- and memory-efficient because of the limited power and memory of these embedded sensors. In this paper, we propose a resource-efficient framework for real-time, continuous, and on-node human activity recognition. Activity recognition is typically formulated as a multi-class classification problem. However, we propose transforming this problem, based on MET (Metabolic Equivalent of Task) values, into a hierarchical classification model that provides a personalized structure for each individual. We discuss the design and construction of this new configurable classification paradigm. Our results demonstrate that the accuracy of the proposed probabilistic cascading system varies between 94.5% and 96.9% across different personalized scenarios while detecting activities with limited memory, and that the system's power usage is reduced by as much as 17.2% compared to traditional methods.
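The abstract does not include implementation details. As a rough illustration only, the sketch below shows one way a MET-based probabilistic cascade could be organized in Python with scikit-learn: a cheap first stage predicts a coarse MET intensity group from low-cost features, a small per-group model refines the decision, and a costlier full multi-class model runs only when the first stage's posterior confidence falls below a threshold. The feature layout, models, threshold, and MET grouping here are illustrative assumptions, not the authors' design.

```python
# Illustrative sketch (not the paper's implementation) of a MET-based
# probabilistic cascade: a cheap stage resolves the coarse MET group and
# a more expensive stage is consulted only when confidence is low.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical data: 3 cheap features (e.g., low-rate accelerometer stats)
# plus 8 richer features (e.g., higher-rate / frequency-domain stats).
n = 600
X_cheap = rng.normal(size=(n, 3))
X_rich = np.hstack([X_cheap, rng.normal(size=(n, 8))])
y_activity = rng.integers(0, 6, size=n)   # 6 specific activities (toy labels)
met_group = y_activity // 2               # toy mapping: activity -> MET group

# Stage 1: small tree over cheap features predicts the MET intensity group.
stage1 = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_cheap, met_group)

# Stage 2: one small classifier per MET group over the richer feature set,
# plus a full multi-class fallback used when stage 1 is not confident.
within_group = {
    g: LogisticRegression(max_iter=1000).fit(
        X_rich[met_group == g], y_activity[met_group == g])
    for g in np.unique(met_group)
}
fallback = LogisticRegression(max_iter=1000).fit(X_rich, y_activity)

def classify(x_cheap, x_rich, conf_threshold=0.8):
    """Cascade one sample: cheap MET-group decision first, costly model only if needed."""
    proba = stage1.predict_proba(x_cheap.reshape(1, -1))[0]
    g = int(np.argmax(proba))
    if proba[g] >= conf_threshold:
        # Confident coarse decision: run only the small within-group model.
        return g, int(within_group[g].predict(x_rich.reshape(1, -1))[0])
    # Low confidence: pay for the full multi-class model over all activities.
    return g, int(fallback.predict(x_rich.reshape(1, -1))[0])

print(classify(X_cheap[0], X_rich[0]))
```

The energy argument behind such a cascade is that most windows are resolved by the cheap first stage, so the richer features and larger model are computed only for the ambiguous minority of samples.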
- Published
- 2022