
Automatic Tracking System For Organs In Robotic Surgery

3D Model Registration, Augmented Reality, Minimally Invasive Surgery, Neural Networks

Introduction

By analysing the endoscopic video stream of a generic surgical robot, the method covered by this patent overlays the 3D model of a generic organ onto its 2D counterpart in the video stream, using estimated values of position and rotation, and then keeps tracking it in real time.
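
The core of the overlay step is projecting the 3D organ model into the 2D endoscope frame from an estimated position and rotation. The snippet below is a minimal sketch of such a projection using a standard pinhole-camera model; the function name, the argument layout, and the use of OpenCV are illustrative assumptions, not the patented method.

```python
# Minimal sketch (not the patented method): overlay a 3D organ model onto a
# 2D endoscope frame given an estimated pose, via a pinhole-camera projection.
# All names (project_model, K, rvec, tvec) are illustrative assumptions.
import numpy as np
import cv2


def project_model(frame, model_vertices, rvec, tvec, K):
    """Project the 3D model vertices into the 2D frame and draw them as an overlay.

    frame          : HxWx3 BGR endoscope image
    model_vertices : Nx3 float array of 3D points of the organ model
    rvec, tvec     : estimated rotation (Rodrigues vector) and translation of the organ
    K              : 3x3 intrinsic matrix of the endoscopic camera
    """
    # Project the model points with the estimated pose (no lens distortion assumed).
    points_2d, _ = cv2.projectPoints(model_vertices, rvec, tvec, K, None)
    overlay = frame.copy()
    for u, v in points_2d.reshape(-1, 2):
        u, v = int(u), int(v)
        if 0 <= u < frame.shape[1] and 0 <= v < frame.shape[0]:
            cv2.circle(overlay, (u, v), 1, (0, 255, 0), -1)
    # Blend the projected model with the original frame (simple AR overlay).
    return cv2.addWeighted(overlay, 0.6, frame, 0.4, 0)
```
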

Technical features

The proposed method uses Augmented Reality (AR) and Artificial Intelligence (AI) as a navigation aid in minimally invasive robotic surgery (MIS), where the operating field is viewed through an endoscopic camera. The aim of the method in this patent proposal is to increase the information available to the surgeon through AR techniques. The video stream is first processed by a neural network that localises the different elements in the scene. The organ is then isolated, and a second neural network predicts its rotation. A 3D model of the patient-specific organ (built from pre-operative imaging with graphical modelling software) is then projected onto the scene via AR to assist the surgeon during the operation. Finally, a third, tracking neural network estimates the displacements of the camera and adapts the position and rotation of the overlaid organ accordingly.
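
To make the three-stage pipeline above concrete, the following sketch composes the three networks as black-box callables: a segmentation network that localises and isolates the organ, a rotation-prediction network, and a tracking network that estimates the camera displacement between frames. The class and helper names (OrganOverlayPipeline, mask_centroid) and the centroid-based position estimate are hypothetical illustrations, not the patented implementation.

```python
# Illustrative sketch of the three-stage pipeline, not the patented implementation.
# The three networks are assumed to exist and are treated as black-box callables.
import numpy as np


def mask_centroid(mask):
    """Hypothetical helper: use the segmentation-mask centroid as a coarse position."""
    ys, xs = np.nonzero(mask)
    return np.array([xs.mean(), ys.mean()])


class OrganOverlayPipeline:
    def __init__(self, segment_net, rotation_net, tracking_net, organ_model):
        self.segment_net = segment_net    # 1st network: localises elements in the scene
        self.rotation_net = rotation_net  # 2nd network: predicts the organ rotation
        self.tracking_net = tracking_net  # 3rd network: estimates camera displacement
        self.organ_model = organ_model    # patient-specific 3D model from pre-op imaging
        self.pose = None                  # current (rotation, position) of the organ

    def initialise(self, frame):
        """First frame: isolate the organ and estimate its initial pose."""
        mask = self.segment_net(frame)              # isolate the organ in the scene
        organ_patch = frame * mask[..., None]       # keep only the organ pixels
        rotation = self.rotation_net(organ_patch)   # predicted organ rotation
        position = mask_centroid(mask)              # coarse position estimate (assumption)
        self.pose = (rotation, position)
        return self.pose

    def update(self, frame):
        """Subsequent frames: adapt the pose to the estimated camera displacement."""
        delta_rotation, delta_position = self.tracking_net(frame)
        rotation, position = self.pose
        self.pose = (rotation + delta_rotation, position + delta_position)
        return self.pose
```

The updated pose returned at each frame would then drive the AR projection of the 3D model, as in the overlay sketch shown earlier.
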

Possible Applications

The application can be used in the medical field as intra-operative navigation support during robotic surgery.

Advantages

  • Fast identification of surgical targets and of critical structures to be excluded from manipulation;
  • Prevention of surgeon disorientation by increasing their spatial awareness, resulting in less time spent in the operating room;
  • Reduction of the surgeon's cognitive load during surgery, as information from different sources no longer has to be mentally matched to the scene.