3D Tracking for Robotic Endoscopic Surgery Using Image Segmentation Maps and Geometric Cues 

Introduction

Virtual simulation systems are transforming surgical practice: they reduce patient risk, lower training costs, and improve surgical proficiency. Such systems depend on recovering the precise 3D motion of instruments, which is especially important in minimally invasive surgery. Traditional physical markers slow procedures and introduce infection risk, so this approach works directly from raw surgical video, using techniques such as background subtraction and deep-learning-based segmentation instead. The 3D tracking problem is decomposed into 2D tracking in the image plane, which is then lifted to 3D motion using geometric cues. By combining kinematics, interval arithmetic, and geometric analysis, the method tracks instruments accurately and helps them avoid collisions with critical tissue, opening the door to real-time surgical training and autonomous instrument tracking.


Key Elements of the Approach:


Rectangular Interval Representation: In each frame of an endoscopic video, surgical instruments are approximated as a series of rectangular intervals. Each interval is described by parameters including its center (x, y), length (l), width (w), and angle (θ).
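The five-parameter interval above can be sketched as a small data structure; the class name and the corners() helper below are illustrative, not taken from the original work:

```python
import math
from dataclasses import dataclass

@dataclass
class RectInterval:
    """One rectangular interval approximating part of an instrument."""
    x: float      # center x (pixels)
    y: float      # center y (pixels)
    l: float      # length along the instrument axis (pixels)
    w: float      # width across the axis (pixels)
    theta: float  # orientation angle (radians)

    def corners(self):
        """Four corner points of the rotated rectangle, for drawing or overlap tests."""
        c, s = math.cos(self.theta), math.sin(self.theta)
        half = [(+self.l / 2, +self.w / 2), (+self.l / 2, -self.w / 2),
                (-self.l / 2, -self.w / 2), (-self.l / 2, +self.w / 2)]
        # Rotate each half-extent offset by theta, then translate to the center.
        return [(self.x + dx * c - dy * s, self.y + dx * s + dy * c)
                for dx, dy in half]
```

With θ = 0 the corners reduce to the axis-aligned rectangle, which makes the parameterization easy to sanity-check.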

Connection of Intervals: Instruments consist of multiple parts, and the intervals representing those parts are interconnected to reflect the realistic articulation of the instrument.

Overlapping Region: The overlapping region between connected intervals captures the relative motion between the parts of the instrument, so the connected intervals behave like a mechanical linkage.
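One way to realize this linkage is to place each child interval so that a fixed-length segment of it lies inside its parent; this is a sketch of an assumed construction, and the exact overlap formulation in the original method may differ:

```python
import math

def tip(x, y, l, theta):
    """Endpoint of an interval along its axis, measured from its center."""
    return (x + (l / 2) * math.cos(theta), y + (l / 2) * math.sin(theta))

def attach(parent, child_l, child_w, child_theta, overlap):
    """Place a child interval so it overlaps the parent's tip by `overlap`.

    parent: (x, y, l, w, theta) of the parent interval.
    Returns the (x, y, l, w, theta) tuple of the positioned child.
    """
    px, py, pl, pw, pth = parent
    jx, jy = tip(px, py, pl, pth)  # joint sits at the parent's tip
    # Shift the child's center so `overlap` of its length lies inside the parent.
    cx = jx + (child_l / 2 - overlap) * math.cos(child_theta)
    cy = jy + (child_l / 2 - overlap) * math.sin(child_theta)
    return (cx, cy, child_l, child_w, child_theta)
```

Changing only child_theta then swings the child about the shared joint, which is the linkage effect the text describes.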

3D Motion Tracking: The system tracks the 3D motion of instruments by monitoring the change in position and orientation of these intervals across frames.
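Frame-to-frame changes in an interval's parameters can be lifted to 3D motion under a pinhole camera model. The depth-from-scale cue below (apparent length shrinks as the instrument moves away) and the focal-length and true-length values are assumptions for illustration, not values from the original work:

```python
def delta_pose(prev, curr, f=500.0, true_len=0.05):
    """Estimate (dX, dY, dZ, d_theta) between two frames.

    prev/curr: (x, y, l, theta) interval parameters in pixels/radians.
    f: assumed focal length in pixels; true_len: assumed real length in metres.
    """
    x0, y0, l0, th0 = prev
    x1, y1, l1, th1 = curr
    # Pinhole model: apparent length l = f * true_len / Z, so Z = f * true_len / l.
    z0, z1 = f * true_len / l0, f * true_len / l1
    dx = (x1 - x0) * z1 / f  # back-project the pixel shift to metres at depth z1
    dy = (y1 - y0) * z1 / f
    return (dx, dy, z1 - z0, th1 - th0)
```

Running this over every consecutive pair of frames yields the 3D trajectory of each interval, and hence of the instrument part it represents.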

Simulation for Visualization: The 3D motion is visualized through a simulation, allowing surgeons to observe the instrument's movement from various angles.
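Viewing the reconstructed motion from different angles amounts to re-expressing each tracked 3D point in a rotated camera frame; this minimal sketch uses a pure yaw rotation, with the function name assumed for illustration:

```python
import math

def view_from(point, yaw):
    """Rotate a 3D point into a camera frame yawed by `yaw` radians about the vertical axis."""
    x, y, z = point
    c, s = math.cos(yaw), math.sin(yaw)
    return (c * x + s * z, y, -s * x + c * z)
```

A full simulation (e.g. in PyBullet) applies the same idea with complete camera poses, but the per-point transform is the core of rendering the instrument from arbitrary viewpoints.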

Benefits and Validation:


Precision: The system achieves accurate instrument tracking with negligible error, as validated through 2D and 3D error analysis, and reports up to 94.99% better accuracy than state-of-the-art methods.

Real-time Training: The approach has the potential for real-time surgical training, providing a valuable resource for medical professionals to enhance their skills.

Autonomous Tracking: The system can also be applied to autonomous instrument tracking during surgery, reducing the need for manual control.

Reduced Risks: By eliminating the need for physical markers, both the risk of infection and the procedural time are significantly reduced.

This approach represents a significant step forward in minimally invasive surgery, promising safer and more precise procedures while also supporting training and autonomous surgical applications.

VICON Experimentation

ROS-Pybullet Simulation