
Real-time automatic surgical phase recognition in laparoscopic sigmoidectomy using the convolutional neural network-based deep learning approach

Authors :
Hiroki Matsuzaki
Hirohisa Miura
Tatsuya Oda
Masahiko Watanabe
Takahiro Yamanashi
Daichi Kitaguchi
Hiroaki Takano
Yusuke Sugomori
Tsuyoshi Enomoto
Nobuyoshi Takeshita
Masaaki Ito
Seigo Hara
Yohei Owada
Daisuke Sato
Source :
Surgical Endoscopy. 34:4924-4931
Publication Year :
2019
Publisher :
Springer Science and Business Media LLC, 2019.

Abstract

Automatic surgical workflow recognition is a key component of context-aware computer-assisted surgery (CA-CAS) systems. However, automatic surgical phase recognition focused on colorectal surgery has not been reported. We aimed to develop a deep learning model for automatic surgical phase recognition from laparoscopic sigmoidectomy (Lap-S) videos that could run in real time, and to evaluate the accuracy of automatic surgical phase and action recognition based on visual information alone. The dataset contained 71 Lap-S cases. The video data were split into static frames at intervals of 1/30 s. Every Lap-S video was manually divided into 11 surgical phases (Phases 0–10), and each surgical action was manually annotated on every frame. A convolutional neural network (CNN)-based deep learning model was trained on the training data, and its performance was validated on a set of unseen test data. The average surgical time was 175 min (± 43 min SD), and the durations of the individual surgical phases also varied widely between cases. Each surgery started in the first phase (Phase 0) and ended in the last phase (Phase 10), with phase transitions occurring 14 (± 2 SD) times per procedure on average. The accuracy of automatic surgical phase recognition was 91.9%, and the accuracies of automatic surgical action recognition for extracorporeal action and irrigation were 89.4% and 82.5%, respectively. Moreover, the system performed real-time automatic surgical phase recognition at 32 fps. The CNN-based deep learning approach enabled recognition of surgical phases and actions in 71 Lap-S cases based on manually annotated data. The system achieved high accuracy for both automatic surgical phase recognition and recognition of the target surgical actions, and the study demonstrated the feasibility of real-time automatic surgical phase recognition at a high frame rate.
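
For orientation only, the following is a minimal Python sketch of the kind of per-frame pipeline the abstract describes: video frames sampled at 1/30 s and classified into the 11 phases by a CNN. It is not the authors' implementation; the backbone (ResNet-50), preprocessing, and all function names are assumptions made for illustration.

import cv2
import torch
import torch.nn as nn
from torchvision import models, transforms

NUM_PHASES = 11  # Phases 0-10, as defined in the paper

# Hypothetical backbone; the paper only states "CNN-based deep learning".
model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V1)
model.fc = nn.Linear(model.fc.in_features, NUM_PHASES)
model.eval()

preprocess = transforms.Compose([
    transforms.ToPILImage(),
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

@torch.no_grad()
def classify_video(path: str):
    """Yield a predicted phase index for every frame (one per 1/30 s) of a video."""
    cap = cv2.VideoCapture(path)
    while True:
        ok, frame_bgr = cap.read()
        if not ok:
            break
        frame_rgb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB)
        x = preprocess(frame_rgb).unsqueeze(0)  # shape (1, 3, 224, 224)
        logits = model(x)                        # shape (1, NUM_PHASES)
        yield int(logits.argmax(dim=1))
    cap.release()

In practice, real-time throughput comparable to the reported 32 fps would depend on GPU inference and batching choices that the abstract does not specify.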

Details

ISSN :
1432-2218 and 0930-2794
Volume :
34
Database :
OpenAIRE
Journal :
Surgical Endoscopy
Accession number :
edsair.doi.dedup.....b98083d36044818b7a1e6092a2c239a4