Video labelling robot-assisted radical prostatectomy and the role of artificial intelligence (AI): training a novice.

Authors :
Cheikh Youssef, Samy
Hachach-Haram, Nadine
Aydin, Abdullatif
Shah, Taimur T.
Sapre, Nikhil
Nair, Rajesh
Rai, Sonpreet
Dasgupta, Prokar
Source :
Journal of Robotic Surgery; Apr 2023, Vol. 17 Issue 2, p695-701, 7p
Publication Year :
2023

Abstract

Video labelling is the assigning of meaningful information to raw videos. With the evolution of artificial intelligence and its intended incorporation into the operating room, video datasets can be invaluable tools for education and for training intelligent surgical workflow systems through computer vision. However, the manual labelling of video datasets can prove costly and time-consuming for already busy practising surgeons. Twenty-five robot-assisted radical prostatectomy (RARP) procedures were recorded on Proximie, an augmented reality platform, anonymised, and made accessible to a novice, who was trained to develop the knowledge and skills needed to accurately segment a full-length RARP procedure on a video labelling platform. One labelled video was subsequently selected at random for assessment of accuracy by four practising urologists. Of the 25 videos allocated, 17 were deemed suitable for labelling, and 8 were excluded on the basis of procedure length and video quality. The labelled video selected for assessment was graded for accuracy of temporal labelling, with an average score of 93.1% (range 85.6–100%). The self-training of a novice in the accurate segmentation of a surgical video to the standard of a practising urologist is feasible and practical for the RARP procedure. The assigning of temporal labels on a video labelling platform was also studied and proved feasible throughout the study period. [ABSTRACT FROM AUTHOR]

Details

Language :
English
ISSN :
1863-2483
Volume :
17
Issue :
2
Database :
Complementary Index
Journal :
Journal of Robotic Surgery
Publication Type :
Academic Journal
Accession number :
162916675
Full Text :
https://doi.org/10.1007/s11701-022-01465-y