Mixed Reality Navigation for Laparoscopic Surgery

Brian Xavier, Franklin King, Ahmed Hosny, David Black, Steve Pieper, Jagadeesan Jayender

The role of mixed reality, which combines augmented and virtual reality, in the healthcare industry, and specifically in modern surgical interventions, has yet to be established. In laparoscopic surgery, precision navigation with real-time feedback on distances from sensitive structures such as the pulmonary vessels is critical to preventing complications. Combining video assistance with newer navigational technologies to improve outcomes in a simple, cost-effective approach remains a constant challenge.
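The distance feedback described above can be sketched as a nearest-neighbor check between the tracked instrument tip and a point cloud segmented from a sensitive structure. This is an illustrative sketch only; the function names, the 10 mm threshold, and the point-cloud representation are assumptions, not details from the study.

```python
import math

def nearest_distance(tip, structure_points):
    """Euclidean distance from the instrument tip to the closest
    sampled point of a segmented structure (e.g., a pulmonary vessel)."""
    return min(math.dist(tip, p) for p in structure_points)

def proximity_alert(tip, structure_points, threshold_mm=10.0):
    """Return (distance, alert): alert is True when the tip is inside
    the safety threshold, e.g., to trigger audio feedback."""
    d = nearest_distance(tip, structure_points)
    return d, d < threshold_mm

# Illustrative usage: a vessel sampled as three points, tip 5 mm away.
vessel = [(0.0, 0.0, 0.0), (10.0, 0.0, 0.0), (20.0, 0.0, 0.0)]
distance, alert = proximity_alert((5.0, 0.0, 0.0), vessel)
```

In practice the structure points would come from a segmented CT volume and the tip position from the tracking system, but the distance logic is the same.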

This study aimed to design and validate a novel mixed reality intra-operative surgical navigation environment using a standard model of laparoscopic surgery. We modified an Oculus Rift by mounting two front-facing cameras and streaming images and navigational data from 3D Slicer into the headset, and conducted trials with a standardized Ethicon TASKit surgical skills trainer.

Participants were enrolled and stratified by surgical experience (residents, fellows, and attending surgeons). Using the TASKit box trainer, participants were asked to transfer pegs, identify radiolabeled pegs, and precisely navigate through wire structures. Tasks were repeated and incrementally aided with modalities such as 3D volumetric navigation, audio feedback, and mixed reality. A final randomized task compared the current standard, laparoscopy with CT guidance, against the proposed mixed reality standard incorporating all additional modalities. Metrics such as success rate, task time, error rate, and user kinematics were recorded to assess learning and efficiency.
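The performance metrics above can be aggregated from per-trial logs. A minimal sketch, assuming a hypothetical trial record format and a simple path-length measure for kinematics (the study's actual logging schema is not given in the abstract):

```python
import math

def summarize_trials(trials):
    """Aggregate success rate, mean task time, and error rate from trial
    records of the form {'success': bool, 'time_s': float, 'errors': int}.
    The record schema is a hypothetical example."""
    n = len(trials)
    return {
        "success_rate": sum(t["success"] for t in trials) / n,
        "mean_time_s": sum(t["time_s"] for t in trials) / n,
        "errors_per_trial": sum(t["errors"] for t in trials) / n,
    }

def path_length(positions):
    """Total 3D path length of instrument-tip positions: a simple
    kinematic proxy for economy of movement."""
    return sum(math.dist(a, b) for a, b in zip(positions, positions[1:]))

# Illustrative usage with two trials and a short tip trajectory.
trials = [
    {"success": True, "time_s": 42.0, "errors": 1},
    {"success": False, "time_s": 60.0, "errors": 3},
]
summary = summarize_trials(trials)
length = path_length([(0, 0, 0), (3, 4, 0), (3, 4, 12)])
```

Shorter path length at equal success rate would indicate more efficient navigation under a given modality.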

Conclusions: A mixed reality surgical environment incorporating real-time video assistance, navigational and radiologic data, and audio feedback has been created to better enable laparoscopic surgical navigation, with early validation demonstrating potential use cases.