1. iMouse: Augmentative Communication with Patients Having Neuro-Locomotor Disabilities Using Simplified Morse Code
- Author
Cho, Hyeonseok; Kim, Seungjae; Han, Jeongho
- Subjects
ALS; eyeblink detection; eye detection; user interface; quadrant navigation
- Abstract
Patients with amyotrophic lateral sclerosis (ALS), also known as Lou Gehrig's disease, an incurable disease in which motor neurons are selectively destroyed, gradually lose their mobility as organ dysfunction progresses and eventually find even minor movements and simple communication challenging. To communicate with patients with quadriplegia, researchers have focused on the eyes, the only organ such patients can still move, and have investigated detecting eyeblinks using brainwaves or cameras, as well as selecting letters on a screen through eyeball movements captured by eye-tracking cameras. However, brainwave-based techniques, which use the electrical signals of eye movements to determine a patient's intentions, are sensitive to noise and often identify intent inaccurately. Alternatively, camera-based letter-selection methods detect the movement of eye feature points and make it straightforward to identify a patient's intentions through a predefined decision-making process. However, they have long processing times and are prone to inaccuracy due to errors either in the Morse code mapping assigned to every letter of the alphabet or in the sequential selection method. Therefore, we propose iMouse-sMc, a simplified Morse code-based user interface model that uses an eye mouse for faster and easier communication with such patients. Furthermore, we improved the detection performance of the eye mouse by applying image contrast enhancement, enabling communication with patients even at night. To verify the performance of the proposed eye-mouse user interface, we conducted comparative experiments against existing camera-based communication models using various words. The results revealed that the communication time was reduced to 83 s and the intention recognition accuracy was improved by ~28.16%. Additionally, in low-light environments, where existing models cannot communicate with patients because eye detection fails, the proposed model retained its eye detection capability, showing that it can be used for communication with patients both during the day and at night.
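The abstract describes two technical ingredients: classifying eyeblinks against a simplified Morse code book and applying image contrast enhancement so the eyes remain detectable in low light. The Python sketch below illustrates both ideas under stated assumptions only; it is not the authors' implementation, and the code book, the dot/dash duration threshold, and the CLAHE settings are hypothetical placeholders rather than values from the paper.

```python
# Illustrative sketch (not the iMouse-sMc implementation): decode eyeblink
# durations with a hypothetical simplified Morse-style code book, and apply
# CLAHE contrast enhancement before eye detection in low-light frames.
import cv2

DOT_MAX_SEC = 0.35  # assumed threshold: blinks shorter than this count as dots

# Hypothetical simplified code book; the paper's actual sMc mapping is not
# reproduced here. The point is only that sequences stay short per letter.
SIMPLIFIED_CODE = {
    ".": "A", "-": "E", "..": "I", ".-": "O", "-.": "U", "--": "N",
}

def classify_blinks(durations_sec):
    """Map each blink duration to a dot ('.') or dash ('-')."""
    return "".join("." if d < DOT_MAX_SEC else "-" for d in durations_sec)

def decode(durations_sec):
    """Decode one blink sequence into a letter, or None if not in the table."""
    return SIMPLIFIED_CODE.get(classify_blinks(durations_sec))

def enhance_for_low_light(frame_bgr):
    """CLAHE-based contrast enhancement applied before running an eye detector."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
    return clahe.apply(gray)

if __name__ == "__main__":
    # Two short blinks map to ".." which this illustrative table decodes as "I".
    print(decode([0.20, 0.25]))
```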
- Published
2023