
EndoNet: A Deep Architecture for Recognition Tasks on Laparoscopic Videos.

Authors :
Twinanda, Andru P.
Shehata, Sherif
Mutter, Didier
Marescaux, Jacques
de Mathelin, Michel
Padoy, Nicolas
Source :
IEEE Transactions on Medical Imaging. Jan 2017, Vol. 36, Issue 1, p86-97. 12p.
Publication Year :
2017

Abstract

Surgical workflow recognition has numerous potential medical applications, such as the automatic indexing of surgical video databases and the optimization of real-time operating room scheduling. As a result, surgical phase recognition has been studied in the context of several kinds of surgeries, such as cataract, neurological, and laparoscopic surgeries. In the literature, two types of features are typically used to perform this task: visual features and tool usage signals. However, the visual features used are mostly handcrafted. Furthermore, the tool usage signals are usually collected via a manual annotation process or by using additional equipment. In this paper, we propose a novel method for phase recognition that uses a convolutional neural network (CNN) to automatically learn features from cholecystectomy videos and that relies uniquely on visual information. Previous studies have shown that tool usage signals can provide valuable information for the phase recognition task. Thus, we present a novel CNN architecture, called EndoNet, that is designed to carry out the phase recognition and tool presence detection tasks in a multi-task manner. To the best of our knowledge, this is the first work proposing to use a CNN for multiple recognition tasks on laparoscopic videos. Experimental comparisons to other methods show that EndoNet yields state-of-the-art results for both tasks. [ABSTRACT FROM PUBLISHER]
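The multi-task design described in the abstract, a shared feature extractor feeding both a tool presence head and a phase head, can be illustrated with a minimal numerical sketch. This is not the paper's architecture: the layer dimensions, weights, and label values below are invented for illustration, and the shared CNN features are replaced by a random vector. Tool presence is a multi-label problem (several tools can be visible in a frame, so each tool gets an independent sigmoid), while phase recognition is multi-class (one phase per frame, so a softmax over phases).

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions only (the cholecystectomy setting in the paper
# involves 7 tools and 7 phases, but the feature size here is made up).
n_tools, n_phases, feat_dim = 7, 7, 128

features = rng.standard_normal(feat_dim)             # stand-in for shared CNN features
W_tool = rng.standard_normal((n_tools, feat_dim)) * 0.01   # tool-presence head
W_phase = rng.standard_normal((n_phases, feat_dim)) * 0.01  # phase head

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

tool_probs = sigmoid(W_tool @ features)    # independent probability per tool
phase_probs = softmax(W_phase @ features)  # distribution over phases

# Multi-task training combines both objectives into one loss:
# binary cross-entropy over tools plus cross-entropy over the phase label.
tool_labels = np.array([1, 0, 0, 1, 0, 0, 0], dtype=float)  # hypothetical frame labels
phase_label = 2                                              # hypothetical phase index
bce = -np.mean(tool_labels * np.log(tool_probs)
               + (1 - tool_labels) * np.log(1 - tool_probs))
ce = -np.log(phase_probs[phase_label])
loss = bce + ce
```

Training both heads against this combined loss lets the tool supervision shape the shared features, which is the mechanism the abstract credits for improved phase recognition.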

Details

Language :
English
ISSN :
0278-0062
Volume :
36
Issue :
1
Database :
Academic Search Index
Journal :
IEEE Transactions on Medical Imaging
Publication Type :
Academic Journal
Accession number :
120574782
Full Text :
https://doi.org/10.1109/TMI.2016.2593957