1. Surgical tool classification and localization: results and methods from the MICCAI 2022 SurgToolLoc challenge
- Authors
Zia, Aneeq, Bhattacharyya, Kiran, Liu, Xi, Berniker, Max, Wang, Ziheng, Nespolo, Rogerio, Kondo, Satoshi, Kasai, Satoshi, Hirasawa, Kousuke, Liu, Bo, Austin, David, Wang, Yiheng, Futrega, Michal, Puget, Jean-Francois, Li, Zhenqiang, Sato, Yoichi, Fujii, Ryo, Hachiuma, Ryo, Masuda, Mana, Saito, Hideo, Wang, An, Xu, Mengya, Islam, Mobarakol, Bai, Long, Pang, Winnie, Ren, Hongliang, Nwoye, Chinedu, Sestini, Luca, Padoy, Nicolas, Nielsen, Maximilian, Schüttler, Samuel, Sentker, Thilo, Husseini, Hümeyra, Baltruschat, Ivo, Schmitz, Rüdiger, Werner, René, Matsun, Aleksandr, Farooq, Mugariya, Saaed, Numan, Viera, Jose Renato Restom, Yaqub, Mohammad, Getty, Neil, Xia, Fangfang, Zhao, Zixuan, Duan, Xiaotian, Yao, Xing, Lou, Ange, Yang, Hao, Han, Jintong, Noble, Jack, Wu, Jie Ying, Alshirbaji, Tamer Abdulbaki, Jalal, Nour Aldeen, Arabian, Herag, Ding, Ning, Moeller, Knut, Chen, Weiliang, He, Quan, Bilal, Muhammad, Akinosho, Taofeek, Qayyum, Adnan, Caputo, Massimo, Vohra, Hunaid, Loizou, Michael, Ajayi, Anuoluwapo, Berrou, Ilhem, Niyi-Odumosu, Faatihah, Maier-Hein, Lena, Stoyanov, Danail, Speidel, Stefanie, and Jarc, Anthony
- Subjects
Computer Science - Computer Vision and Pattern Recognition
- Abstract
The ability to automatically detect and track surgical instruments in endoscopic videos can enable transformational interventions. Assessing surgical performance and efficiency, identifying skilled tool use and choreography, and planning operational and logistical aspects of operating room (OR) resources are just a few of the applications that could benefit. Unfortunately, obtaining the annotations needed to train machine learning models to identify and localize surgical tools is a difficult task. Annotating bounding boxes frame by frame is tedious and time-consuming, yet large amounts of data covering a wide variety of surgical tools and surgeries must be captured for robust training. Moreover, ongoing annotator training is needed to stay up to date with surgical instrument innovation. In robotic-assisted surgery, however, potentially informative data such as timestamps of instrument installation and removal can be harvested programmatically. The ability to rely on tool installation data alone would significantly reduce the workload required to train robust tool-tracking models. With this motivation in mind, we invited the surgical data science community to participate in the SurgToolLoc 2022 challenge. The goal was to leverage tool presence data as weak labels for machine learning models trained to detect tools and localize them in video frames with bounding boxes. We present the results of this challenge along with many of the teams' efforts. We conclude by discussing these results in the broader context of machine learning and surgical data science. The training data used for this challenge, consisting of 24,695 video clips with tool presence labels, is also being released publicly and can be accessed at https://console.cloud.google.com/storage/browser/isi-surgtoolloc-2022. (A brief illustrative sketch of this weak-label training setup is given after this record.)
- Published
2023
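
The core idea described in the abstract, using clip-level tool-presence data as weak labels for frame-level models, can be sketched as follows. This is a minimal illustration, not the challenge organizers' or any team's pipeline: the ResNet-18 backbone, the NUM_TOOLS class count, the tensor shapes, and the train_step helper are all assumptions made for the example. A common weak-supervision baseline is to train a multi-label presence classifier on frames that inherit their clip's labels, then derive bounding boxes in a later stage.

```python
# Minimal sketch (assumptions, not the challenge's reference method):
# train a frame-level multi-label tool-presence classifier using only
# clip-level presence labels -- every sampled frame inherits its clip's labels.
import torch
import torch.nn as nn
import torchvision.models as models

NUM_TOOLS = 14  # assumed class count; the challenge defines its own tool set

# ResNet-18 backbone (illustrative choice) with one logit per tool class.
model = models.resnet18(weights=None)
model.fc = nn.Linear(model.fc.in_features, NUM_TOOLS)

criterion = nn.BCEWithLogitsLoss()  # multi-label presence/absence loss
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

def train_step(frames, clip_labels):
    """frames: (B, 3, H, W) images sampled from video clips.
    clip_labels: (B, NUM_TOOLS) 0/1 presence vector copied from each frame's clip."""
    optimizer.zero_grad()
    logits = model(frames)
    loss = criterion(logits, clip_labels.float())
    loss.backward()
    optimizer.step()
    return loss.item()

# Shape check with random tensors (stand-ins for real decoded frames and labels).
frames = torch.randn(4, 3, 224, 224)
labels = torch.randint(0, 2, (4, NUM_TOOLS))
print(train_step(frames, labels))
```

Localization would then come from a second stage, for example thresholding class-activation maps into pseudo-boxes that supervise a standard detector; the actual approaches taken by the challenge teams varied and are described in the paper.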