DS0704

Deep Learning in Multi-view and Multi-modal Surgical Videos for Improved Operating Room Management – DeepSurg

Submission summary

The operating room (OR) is one of the busiest and most information-dense units in the hospital. In this environment, digital medical equipment and cameras permit the collection of vast volumes of data about surgical activities, which can be exploited to improve surgical workflows. The objective of DeepSurg is to harness, in an interdisciplinary manner, the power of computer vision and machine learning to analyze, monitor, and improve surgical workflows non-intrusively.

Because the OR is a challenging environment for computer vision, for instance due to constraints on camera positioning, strong illumination changes, and occlusions, we will capture the surgical activities using multiple RGB-D cameras to benefit from the complementarity between color and depth information. We recently equipped two rooms in the interventional radiology department of the Strasbourg University Hospital, our clinical partner, with such a multi-camera system. In DeepSurg, we will develop methods suited to the OR environment to detect the persons present in the room, identify their roles, and recognize the ongoing surgical activities. Because it is particularly difficult to handcraft low-level visual features for multi-view, multi-modal data, we propose to develop novel methods relying on deep learning to perform these clinical detection and recognition tasks. We will do so using a large and unique dataset of multi-view RGB-D videos recorded during real procedures.
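As an illustration of how such multi-view, multi-modal learning might be structured, below is a minimal sketch in PyTorch. The architecture (separate RGB and depth streams per view, fused by simple averaging across views), the layer sizes, the number of activity classes, and the tensor shapes are all illustrative assumptions, not the project's actual model.

    # Illustrative sketch: multi-view RGB-D fusion for clip-level activity
    # recognition. All shapes and layer sizes are assumptions for the example.
    import torch
    import torch.nn as nn

    class ViewEncoder(nn.Module):
        """Encodes one camera view: RGB (3-channel) and depth (1-channel)
        are processed by separate convolutional streams, then fused."""
        def __init__(self, feat_dim=128):
            super().__init__()
            def stream(in_ch):
                return nn.Sequential(
                    nn.Conv2d(in_ch, 32, 3, stride=2, padding=1), nn.ReLU(),
                    nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
                    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
                )
            self.rgb_stream = stream(3)
            self.depth_stream = stream(1)
            self.proj = nn.Linear(64 + 64, feat_dim)

        def forward(self, rgb, depth):
            feats = torch.cat([self.rgb_stream(rgb),
                               self.depth_stream(depth)], dim=1)
            return self.proj(feats)

    class MultiViewActivityNet(nn.Module):
        """Averages per-view features (a simple fusion choice) and
        classifies the ongoing surgical activity."""
        def __init__(self, n_views=2, n_classes=8, feat_dim=128):
            super().__init__()
            self.encoders = nn.ModuleList(
                [ViewEncoder(feat_dim) for _ in range(n_views)])
            self.classifier = nn.Linear(feat_dim, n_classes)

        def forward(self, rgbs, depths):
            # rgbs, depths: one tensor per view, (B, 3, H, W) / (B, 1, H, W)
            feats = [enc(r, d)
                     for enc, r, d in zip(self.encoders, rgbs, depths)]
            return self.classifier(torch.stack(feats).mean(dim=0))

    # Usage with two synthetic camera views and a batch of 4 frames
    model = MultiViewActivityNet(n_views=2)
    rgbs = [torch.randn(4, 3, 120, 160) for _ in range(2)]
    depths = [torch.randn(4, 1, 120, 160) for _ in range(2)]
    logits = model(rgbs, depths)  # shape (4, 8): one score per activity class

Averaging across views is only one possible fusion choice; learned view weighting would be a natural alternative when some cameras are heavily occluded.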

We will demonstrate our approach on an unmet clinical need, namely the objective analysis of the radiation exposure of clinicians and staff. Various studies, such as that of the ORAMED European consortium, have shown the need for a better understanding of radiation exposure during X-ray guided procedures in order to improve surgical workflows, develop better protective measures, and thereby reduce the risks to clinicians' health. To this end, we will use radiation dose information recorded synchronously with the videos by wireless dosimeters worn by the surgical staff and placed at key positions in the room. This work will permit us to annotate the surgical videos with radiation exposure information linked to the staff's positioning and to the surgical steps. These annotations will allow us to develop an intuitive interface with which the clinical staff can easily browse the videos for post-operative review. Such review can enable clinical staff and hospital radiation protection officers to revisit the surgical workflow, analyze it, and make improvements to minimize radiation exposure. Furthermore, for the first time, DeepSurg will enable the computation of statistics correlating radiation measurements with the underlying surgical activities, thus permitting the design of safer surgical workflows. An example of such a computation is sketched below.
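To make the intended dose-activity correlation concrete, the following is a minimal sketch of aligning time-stamped dosimeter readings with surgical-phase annotations and computing per-phase dose statistics. The phase labels, timestamps, and dose values are synthetic placeholders, not data from the project.

    # Illustrative sketch: align dosimeter samples with surgical phases and
    # aggregate dose per phase. All values below are synthetic placeholders.
    from bisect import bisect_right
    from collections import defaultdict
    from statistics import mean

    # Phase annotations as (start_time_s, label), sorted by start time;
    # each phase runs until the next one begins.
    phases = [(0.0, "patient preparation"), (300.0, "needle insertion"),
              (900.0, "contrast injection"), (1500.0, "closure")]

    # Dosimeter samples (time_s, dose_uSv) recorded in sync with the video.
    readings = [(t, 0.05 * (1 + (600.0 < t < 1400.0)))
                for t in range(0, 1800, 10)]

    def phase_at(t):
        """Return the phase label active at time t (seconds)."""
        starts = [s for s, _ in phases]
        return phases[bisect_right(starts, t) - 1][1]

    doses_by_phase = defaultdict(list)
    for t, dose in readings:
        doses_by_phase[phase_at(t)].append(dose)

    for label, doses in doses_by_phase.items():
        print(f"{label:22s} total={sum(doses):6.2f} uSv"
              f"  mean={mean(doses):.3f} uSv")

Statistics of this kind, per phase and per staff role, are what would let a radiation protection officer pinpoint the workflow steps responsible for most of the exposure.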

The investigator of this project has established successful collaborations with local clinical partners, which have resulted in several operating rooms being permanently equipped with a multi-camera RGB-D recording system. The availability of such setups in real surgical facilities is a rare opportunity. This funding application for a young investigator research award aims to give him the opportunity to build on his previous research and on the clinical infrastructure he has set up, in order to develop and demonstrate the benefits of computational tools for the automated analysis of surgical workflows.

Project coordinator

Mr. Nicolas Padoy (Laboratoire des sciences de l'ingénieur, de l'informatique et de l'imagerie - Université de Strasbourg)

The author of this summary is the project coordinator, who is responsible for its content. The ANR declines any responsibility for its contents.

Partner

ICube - Unistra (Laboratoire des sciences de l'ingénieur, de l'informatique et de l'imagerie - Université de Strasbourg)

ANR funding: 277,560 euros
Beginning and duration of the scientific project: January 2017 - 42 months
