
Thesis Format
Integrated Article
Degree
Doctor of Philosophy
Program
Biomedical Engineering
Supervisor
Peters, Terry M.
2nd Supervisor
Chen, Elvis C.S.
Joint Supervisor
3rd Supervisor
Paul, Narinder
Affiliation
London Health Sciences Centre
Joint Supervisor
Abstract
The groundbreaking advancements in medical imaging technology throughout the 20th century have laid the groundwork for the field of interventional radiology, where a wide range of minimally invasive procedures are performed. Despite their benefits over open surgery, these procedures impose significant cognitive demands due to the mental mapping required between 2D imaging (e.g., ultrasound, fluoroscopy) and the 3D patient and tools. Additionally, the optimal positioning of displays is constrained by physical equipment, leading to a large visual-motor field disparity.
Augmented reality (AR) surgical navigation systems have recently gained attention as a means to address these challenges, particularly with the emergence of optical see-through head-mounted displays, which allow clinicians to retain a natural view of the operating environment. However, technical challenges in calibration, tool tracking, depth perception, and workflow integration have thus far limited their widespread clinical adoption.
This thesis develops and validates a comprehensive AR infrastructure tailored to the Microsoft HoloLens 2. First, virtual displays are introduced to minimize visual-motor field disparity without a significant change to the current workflow, improving clinical feasibility. A user study demonstrates a preference for virtual displays among novices, with no significant impact on procedure time or accuracy.
Next, tracking accuracy is assessed, revealing that monoscopic vision-based tracking falls short of interventional radiology requirements due to depth errors, line-of-sight constraints, and the difficulty of rigidly attaching optical markers to surgical tools. To address this, a novel hand-eye calibration method for the HoloLens 2 is developed, enabling seamless integration of magnetic tracking into the AR system. This solution resolves prior limitations and achieves sub-2 mm accuracy in under a minute, making it suitable for clinical deployment.
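The thesis's specific calibration procedure is not detailed in this abstract, but hand-eye calibration classically amounts to solving A_i X = X B_i, where the A_i and B_i are paired rigid motions observed by the two devices (here, the headset and the magnetic tracker) and X is the unknown transform between them. The sketch below is a generic illustration of that formulation, not the thesis's method: it recovers the rotation with a Procrustes fit on rotation vectors (in the spirit of Park and Martin) and the translation by linear least squares. The function name and structure are illustrative assumptions.

```python
import numpy as np
from scipy.spatial.transform import Rotation

def solve_hand_eye(As, Bs):
    """Least-squares solution of A_i X = X B_i for the unknown 4x4
    rigid transform X, given paired lists of 4x4 motions As and Bs.
    Illustrative sketch only, not the thesis implementation."""
    # For A X = X B, the rotation axes satisfy alpha_i = R_X beta_i,
    # where alpha_i, beta_i are the rotation vectors of A_i, B_i.
    alphas = np.stack([Rotation.from_matrix(A[:3, :3]).as_rotvec() for A in As])
    betas = np.stack([Rotation.from_matrix(B[:3, :3]).as_rotvec() for B in Bs])
    # Procrustes/Kabsch: find R_X minimizing sum ||alpha_i - R_X beta_i||^2.
    H = betas.T @ alphas
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R_X = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    # Translation: (R_Ai - I) t_X = R_X t_Bi - t_Ai, stacked over all pairs.
    C = np.vstack([A[:3, :3] - np.eye(3) for A in As])
    rhs = np.concatenate([R_X @ B[:3, 3] - A[:3, 3] for A, B in zip(As, Bs)])
    t_X, *_ = np.linalg.lstsq(C, rhs, rcond=None)
    X = np.eye(4)
    X[:3, :3] = R_X
    X[:3, 3] = t_X
    return X
```

At least two motion pairs with non-parallel rotation axes are required for the problem to be well-posed; in practice many pairs are collected and averaged through the least-squares fit, which is what makes sub-minute, sub-millimeter-scale calibration plausible.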
Finally, the thesis tackles depth perception challenges in medical AR. Because occlusion is a critical depth cue, a "black hole" projector is introduced to effectively remove real-world light, enhancing the illusion of virtual objects beneath the patient’s surface. A user study confirms improved needle targeting accuracy and reduced mental demand under this approach compared to standard surgical lighting.
This work advances AR integration in interventional radiology by addressing core challenges in visualization, tracking, and clinical feasibility, paving the way for future adoption in surgical navigation.
Summary for Lay Audience
Minimally invasive surgeries, like those performed in interventional radiology, rely on live medical images (such as ultrasound and X-rays) to guide tools inside the body. However, physicians must mentally translate these flat, 2D images into a 3D understanding of the patient and their instruments, which can be mentally taxing. Additionally, the placement of screens in the operating room is often inconvenient, forcing doctors to look away from their hands while working. This research explores how augmented reality (AR) can help by displaying medical images directly in a surgeon’s field of view using a head-mounted display.

First, a system of virtual screens is developed, reducing the need to glance at distant monitors while maintaining a familiar workflow. A user study found that novice users preferred this setup without sacrificing accuracy or speed. Next, the research tackles a major technical challenge: tracking the precise position of tools in 3D space. Experiments show that common camera-based tracking methods do not meet the accuracy required for these procedures. To fix this, a new calibration method for the Microsoft HoloLens 2 is introduced, allowing precise tracking of surgical tools in under a minute. Finally, the study addresses a key issue in medical AR: depth perception. Since the human brain relies on object occlusion to judge depth, this research introduces a “black hole” projector that darkens areas of the real world to enhance the illusion that AR images are inside the patient’s body. Experiments show this approach improves accuracy in needle insertion procedures and helps users better judge depth.

By solving these visualization, tracking, and depth perception challenges, this work brings AR a step closer to practical use in surgery, making procedures safer and more intuitive for doctors.
Recommended Citation
Allen, Daniel R., "Towards Integration of Augmented Reality into the Interventional Radiology Suite" (2025). Electronic Thesis and Dissertation Repository. 10902.
https://ir.lib.uwo.ca/etd/10902
Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 License.
Gesture Tutorial Application from Chapter 2
VirtualDisplay.mp4 (16802 kB)
Virtual Display Scene from Chapter 2
BlackHoleDemo.mp4 (15594 kB)
Black Hole Needle targeting demo from Chapter 5