Learning-Based Motion-Intention Prediction for End-Point Control of Upper-Limb-Assistive Robots
- Author
Sibo Yang, Neha P. Garg, Ruobin Gao, Meng Yuan, Bernardo Noronha, Wei Tech Ang, Dino Accoto (School of Mechanical and Aerospace Engineering; School of Computer Science and Engineering; Rehabilitation Research Institute of Singapore (RRIS))
- Subjects
Wearable Sensors, Mechanical Engineering [Engineering], Computer Science and Engineering [Engineering], Electrical and Electronic Engineering, Upper Limb Assistive Robots, Biochemistry, Instrumentation, Motion Intention Detection, Human–Robot Interaction, Sensory Fusion, Machine Learning, Atomic and Molecular Physics, and Optics, Analytical Chemistry
- Abstract
The lack of intuitive and active human–robot interaction makes it difficult to use upper-limb-assistive devices. In this paper, we propose a novel learning-based controller that intuitively uses onset motion to predict the desired end-point position for an assistive robot. A multi-modal sensing system comprising inertial measurement units (IMUs), electromyographic (EMG) sensors, and mechanomyographic (MMG) sensors was implemented. This system was used to acquire kinematic and physiological signals during reaching and placing tasks performed by five healthy subjects. The onset motion data of each trial were extracted and used as input to traditional regression models and deep learning models for training and testing. The models predict the position of the hand in planar space, which serves as the reference position for the low-level position controllers. The results show that the IMU sensors alone, combined with the proposed prediction models, are sufficient for motion intention detection, providing nearly the same prediction performance as when EMG or MMG data are added. Additionally, recurrent neural network (RNN)-based models can predict target positions from a short onset time window for reaching motions and are suitable for predicting targets over a longer horizon for placing tasks. The detailed analysis in this study can improve the usability of assistive and rehabilitation robots.
- Funding
This work was partially supported by the grant "Intelligent Human–robot interface for upper limb wearable robots" (Award Number SERC1922500046, Agency for Science, Technology and Research (A*STAR), Singapore).
- Published
- 2023
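To make the approach described in the abstract concrete, below is a minimal, hypothetical sketch (not the authors' implementation) of an RNN-based end-point regressor: a GRU consumes a short onset window of multi-channel sensor features (e.g., IMU signals) and outputs a 2-D hand position in planar space. The window length, feature dimension, hidden size, and the choice of a GRU are illustrative assumptions; the paper evaluates several traditional regression and deep learning models rather than this specific architecture.

```python
# Illustrative sketch only: GRU regressor mapping an onset window of sensor
# features to a 2-D end-point (hand) position. All dimensions are placeholders.
import torch
import torch.nn as nn

class OnsetToTargetGRU(nn.Module):
    def __init__(self, n_features: int = 12, hidden_size: int = 64):
        super().__init__()
        # GRU over the onset window; batch_first expects (batch, time, features).
        self.rnn = nn.GRU(input_size=n_features, hidden_size=hidden_size,
                          batch_first=True)
        # Linear head producing the predicted (x, y) position in planar space.
        self.head = nn.Linear(hidden_size, 2)

    def forward(self, onset_window: torch.Tensor) -> torch.Tensor:
        # onset_window: (batch, time_steps, n_features)
        _, h_last = self.rnn(onset_window)     # h_last: (1, batch, hidden_size)
        return self.head(h_last.squeeze(0))    # (batch, 2)

if __name__ == "__main__":
    model = OnsetToTargetGRU()
    # Hypothetical batch: 8 trials, 50 onset samples, 12 sensor channels.
    window = torch.randn(8, 50, 12)
    target_xy = model(window)
    print(target_xy.shape)  # torch.Size([8, 2])
```

In practice, the predicted (x, y) output would be passed as the reference position to a low-level position controller, as the abstract describes; training data, onset-window extraction, and sensor fusion details are specific to the paper and not reproduced here.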