
Human teleoperation - a haptically enabled mixed reality system for teleultrasound.

Authors :
Black, David
Oloumi Yazdi, Yas
Hadi Hosseinabadi, Amir Hossein
Salcudean, Septimiu
Source :
Human-Computer Interaction. 2024, Vol. 39 Issue 5/6, p529-552. 24p.
Publication Year :
2024

Abstract

Current teleultrasound methods include audiovisual guidance and robotic teleoperation, which trade off precision and latency against flexibility and cost. We present a novel concept of "human teleoperation" which bridges the gap between these two methods. In this concept, an expert remotely teleoperates a person (the follower) wearing a mixed-reality headset by controlling a virtual ultrasound probe projected into the follower's scene. The follower matches the pose and force of the virtual device with a real probe. The pose, force, video, ultrasound images, and 3-dimensional mesh of the scene are fed back to the expert. This control framework, in which the actuation is carried out by a person, allows more precision and speed than verbal guidance, yet is more flexible and inexpensive than robotic teleoperation. The purpose of this paper is to introduce this concept as well as a prototype teleultrasound system with limited haptics and local communication. The system was tested to show its potential, achieving mean teleoperation latencies of 0.32 ± 0.05 seconds and steady-state errors of 4.4 ± 2.8 mm and 5.4 ± 2.8° in position and orientation tracking, respectively. A preliminary test with an ultrasonographer and four patients showed lower measurement error and a completion time of 1:36 ± 0:23 minutes using human teleoperation, compared to 4:13 ± 3:58 using audiovisual teleguidance.
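The abstract reports steady-state tracking errors in millimeters (position) and degrees (orientation). As a minimal illustrative sketch, not the authors' actual evaluation code, the Python below shows one common way such errors could be quantified: Euclidean distance between commanded and actual probe positions, and the geodesic angle between unit quaternions for orientation. The function names and the quaternion convention (w, x, y, z) are assumptions for this example.

```python
import numpy as np

def position_error_mm(p_expert, p_follower):
    """Euclidean distance (mm) between commanded and actual probe positions."""
    return float(np.linalg.norm(np.asarray(p_expert, float) - np.asarray(p_follower, float)))

def orientation_error_deg(q_expert, q_follower):
    """Angle (degrees) of the rotation taking one unit quaternion to the other."""
    q1 = np.asarray(q_expert, dtype=float)
    q2 = np.asarray(q_follower, dtype=float)
    q1 /= np.linalg.norm(q1)
    q2 /= np.linalg.norm(q2)
    # abs() handles the double cover of SO(3): q and -q are the same rotation.
    dot = min(1.0, abs(float(np.dot(q1, q2))))
    return float(np.degrees(2.0 * np.arccos(dot)))

# Example: follower probe is 3 mm off along x and rotated 5 degrees about z.
theta = np.radians(5.0)
q_cmd = [1.0, 0.0, 0.0, 0.0]                          # identity, (w, x, y, z)
q_act = [np.cos(theta / 2), 0.0, 0.0, np.sin(theta / 2)]
print(position_error_mm([0, 0, 0], [3, 0, 0]))        # 3.0
print(round(orientation_error_deg(q_cmd, q_act), 1))  # 5.0
```

Averaging these two scalars over a tracking session would yield summary statistics of the form reported in the abstract (mean ± standard deviation).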

Details

Language :
English
ISSN :
07370024
Volume :
39
Issue :
5/6
Database :
Academic Search Index
Journal :
Human-Computer Interaction
Publication Type :
Academic Journal
Accession number :
179415592
Full Text :
https://doi.org/10.1080/07370024.2023.2218355