IEEE Access (Jan 2024)
Multi-User Surgical Navigation Platform Based on Mixed Reality
Abstract
Extended Reality (XR) technology enables visualization of patient data during surgery, with Augmented Reality (AR) systems offering 2D or 3D navigation. However, current systems face limitations such as the lack of depth perception in 2D navigation, single-user restrictions in 3D navigation, and limited interoperability between tracking markers and navigation platforms. In addition, users cannot interact with virtual objects, which reduces operational efficiency and intuitiveness. This study addresses these issues by proposing a multi-user Mixed Reality (MR) guidance platform that provides interactive, real-time feedback and enhances collaboration. The platform maps 3D medical models onto real-world objects using 2D marker calibration, supporting focused ultrasound for transcranial brain therapy. Integration with BrainLab and HoloLens 2 achieved a single-user positioning accuracy of $0.576 \pm 0.294$ mm, with multi-user alignment errors averaging $2.718 \pm 1.335$ mm. The system maintained a frame rate of 50-60 FPS, demonstrating robust performance and precision.
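The marker-based alignment summarized above ultimately reduces to estimating a rigid transform between tracked fiducial points and their counterparts in the 3D model, and reporting a residual error such as the positioning accuracy quoted here. The paper's actual pipeline is not reproduced below; this is only a minimal sketch of one standard approach (Kabsch/Umeyama point-set registration without scaling), where the function names and the synthetic fiducial coordinates are illustrative assumptions:

```python
import numpy as np

def rigid_transform(src, dst):
    """Least-squares rotation R and translation t mapping src points onto dst
    (Kabsch/Umeyama without scaling). src, dst: (N, 3) corresponding points."""
    src = np.asarray(src, float)
    dst = np.asarray(dst, float)
    c_src = src.mean(axis=0)
    c_dst = dst.mean(axis=0)
    # 3x3 cross-covariance of the centered point sets
    H = (src - c_src).T @ (dst - c_dst)
    U, _, Vt = np.linalg.svd(H)
    # Reflection guard: force det(R) = +1 so R is a proper rotation
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_dst - R @ c_src
    return R, t

def fiducial_registration_error(src, dst, R, t):
    """RMS distance between transformed src and dst (in mm if inputs are mm)."""
    resid = (R @ np.asarray(src, float).T).T + t - np.asarray(dst, float)
    return float(np.sqrt((resid ** 2).sum(axis=1).mean()))
```

With noise-free synthetic fiducials the recovered transform is exact and the error is zero; applied to real tracked markers, the same RMS residual plays the role of the millimeter-level accuracy figures reported in the abstract.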
Keywords