Abstract
Neurosurgical procedures pose great challenges for the surgeon, as a high degree of precision is required. Operations are performed within a limited space, and concealed structures are often not visible to the surgeon. A system is proposed that integrates augmented reality into a digital operating room. The basis for this is an understanding of the scene and integration into the surgical workflow. In a first step, a two-stage process is implemented to detect the patient on the operating table with high precision. Furthermore, a solution is presented to semantically segment the surgical scene in order to detect and track medical instruments. For a better understanding of the situation in the operating room, the medical staff is tracked with OpenPose. These solutions form the basis for a precise and robust integration of augmented reality into the digital operating room.

This work is licensed under a Creative Commons Attribution 4.0 International License.
Copyright (c) 2020 Christian Kunz, Franziska Mathis-Ullrich, Björn Hein
