
Data-driven Holistic Framework for Automated Laparoscope Optimal View Control with Learning-based Depth Perception

Authors :
Li, Bin
Lu, Bo
Lu, Yiang
Dou, Qi
Liu, Yun-Hui
Publication Year :
2020

Abstract

Laparoscopic Field of View (FOV) control is one of the most fundamental and important components in Minimally Invasive Surgery (MIS). Nevertheless, the traditional manual holding paradigm easily fatigues surgical assistants, and miscommunication with the surgeon further hinders the assistant from providing a high-quality FOV. Targeting this problem, we present a data-driven framework to realize automated laparoscopic optimal FOV control. To achieve this goal, we offline learn a motion strategy of the laparoscope relative to the surgeon's hand-held surgical tool from our in-house surgical videos, developing our control domain knowledge and an optimal view generator. To adjust the laparoscope online, we first adopt a learning-based method to segment the two-dimensional (2D) position of the surgical tool, and then leverage this outcome to obtain its scale-aware depth from the dense depth estimation produced by our novel unsupervised RoboDepth model using only monocular camera feedback, thereby fusing the resulting real-time 3D position into our control loop. To eliminate the misorientation of the FOV caused by the Remote Center of Motion (RCM) constraint when moving the laparoscope, we propose a novel distortion constraint based on an affine map that minimizes visual warping, and a null-space controller is embedded in the framework to optimize all types of errors in a unified and decoupled manner. Experiments conducted with a Universal Robot (UR) and Karl Storz laparoscope/instruments demonstrate the feasibility of our domain-knowledge- and learning-enabled framework for automated camera control.

Comment :
7 pages, 7 figures, 2021 IEEE International Conference on Robotics and Automation (ICRA)
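The abstract describes fusing the segmented 2D tool position with the dense monocular depth map to recover a real-time, scale-aware 3D tool position for the control loop. As a rough illustration only (not the paper's implementation), the sketch below back-projects the mask centroid through a pinhole camera model; the function name, the centroid/median choices, and the intrinsic matrix K are assumptions for illustration, and the depth map is treated as an already metrically scaled output of a monocular depth network such as RoboDepth.

```python
import numpy as np

def tool_tip_3d_position(mask, depth_map, K):
    """Back-project the segmented tool region into camera coordinates.

    mask      : (H, W) boolean array from the 2D tool segmentation
    depth_map : (H, W) scale-aware depth in metres (e.g. a monocular
                depth prediction after metric rescaling)
    K         : (3, 3) pinhole camera intrinsic matrix
    """
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        raise ValueError("empty tool mask")

    # Use the mask centroid as the 2D tool position and a robust
    # (median) depth over the masked pixels to suppress outliers.
    u, v = xs.mean(), ys.mean()
    z = np.median(depth_map[ys, xs])

    # Pinhole back-projection: p = z * K^{-1} * [u, v, 1]^T
    fx, fy = K[0, 0], K[1, 1]
    cx, cy = K[0, 2], K[1, 2]
    x = (u - cx) / fx * z
    y = (v - cy) / fy * z
    return np.array([x, y, z])
```

The resulting point, expressed in the camera frame, would still need to be transformed into the robot base frame (e.g. via hand-eye calibration) before being used by the view controller.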

Details

Database :
OAIster
Publication Type :
Electronic Resource
Accession number :
edsoai.on1228447782
Document Type :
Electronic Resource