During tumor resections, surgeons must integrate visual cues from the operative field with feedback from pathology to evaluate resection margins. This requires constructing a complex mental spatial map of the tumor bed and relating pathological findings back to it to guide additional resections. This cognitive burden can lead to incomplete resections and positive margins, outcomes that often necessitate additional surgeries and increase patient risk, anxiety, and overall healthcare burden. To address this challenge, we are investigating a mixed reality (MR) system that maps pathology-confirmed positive margins directly to their corresponding locations within the surgical field. A critical requirement of this system is accurate alignment of pathology data with the intraoperative presentation of the tumor bed. We present and evaluate a rigid registration workflow that uses surgical clips placed on both resection specimens and the corresponding tumor-bed sites as fiducial markers. Data acquisition was performed using a Microsoft HoloLens® 2 configured to track a stylus fitted with retroreflective spheres, with voice commands incorporated to streamline point collection and registration. System point-acquisition accuracy and precision were assessed using a machined phantom containing nine fiducial points, each with known ground truth and positioned throughout the field of view. Precision was measured from the point distribution of repeated acquisitions, while accuracy was evaluated by registering the collected data to the ground-truth coordinates. To investigate our MR approach within the intended clinical context, a second soft-tissue-like breast phantom was created to simulate a mock resection.
Conventional surgical clips were used as fiducial markers, and target points within the mock resection site were created using a process that ensured known correspondence between the mock specimen and the tumor bed. Ground-truth comparisons were independently captured using a second optical tracker. Following rigid registration, specimen targets were projected into the phantom cavity using the HoloLens® 2 and compared against the ground truth. Point-acquisition precision yielded a fiducial localization error (FLE) of 0.7 ± 0.2 mm. When registering to the ground-truth machined phantom, the fiducial registration error (FRE) was 1.8 ± 0.6 mm and the target registration error (TRE) was 2.6 ± 0.9 mm. In the mock clinical scenario, the deep-margin gold-standard alignment using the conventional optical tracker yielded an FRE of 2.8 ± 1.0 mm with a mean TRE of 3.3 ± 1.7 mm. When performing the same experiment using the HoloLens® 2 internal tracking, the composite mean FRE increased to 3.8 ± 0.8 mm and the composite mean TRE to 5.3 ± 1.1 mm. Overall, these results are encouraging, but they also suggest that localization accuracy degrades when using a head-mounted display rather than a conventional optical tracking device.
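For readers unfamiliar with the error metrics above, the following sketch illustrates how paired-fiducial rigid registration and the FRE/TRE metrics are commonly computed. It uses the standard Kabsch (SVD-based least-squares) solver on synthetic points; this is a generic illustration, not necessarily the exact solver used in our workflow, and all point coordinates below are invented for the example.

```python
import numpy as np

def rigid_register(source, target):
    """Least-squares rigid (rotation + translation) alignment of corresponding
    3-D fiducials via the Kabsch method. Returns R (3x3) and t (3,) such that
    source @ R.T + t approximates target (points stored as rows)."""
    src_c = source.mean(axis=0)
    tgt_c = target.mean(axis=0)
    H = (source - src_c).T @ (target - tgt_c)  # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = tgt_c - R @ src_c
    return R, t

def rms_error(a, b):
    """Root-mean-square distance between corresponding point sets."""
    return float(np.sqrt(np.mean(np.sum((a - b) ** 2, axis=1))))

# Synthetic ground-truth pose: 30 degrees about z plus a translation
theta = np.deg2rad(30.0)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([5.0, -2.0, 3.0])

# Fiducial clips (specimen space) and one held-out target point (mm)
fiducials = np.array([[0, 0, 0], [10, 0, 0], [0, 10, 0],
                      [0, 0, 10], [10, 10, 0]], dtype=float)
target_pt = np.array([[4.0, 4.0, 4.0]])

# "Measured" tumor-bed counterparts (noiseless here for clarity)
fid_meas = fiducials @ R_true.T + t_true
tgt_meas = target_pt @ R_true.T + t_true

R, t = rigid_register(fiducials, fid_meas)
fre = rms_error(fiducials @ R.T + t, fid_meas)  # fiducial registration error
tre = rms_error(target_pt @ R.T + t, tgt_meas)  # target registration error
```

FRE measures the residual on the fiducials used to drive the registration, while TRE is evaluated on points withheld from it; in practice, adding measurement noise (the FLE) to `fid_meas` makes both nonzero, with TRE the clinically meaningful quantity.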
DOI: 10.1117/12.3087843