Barber, Samuel R.1; Watson, Geoff E.; Dewyer, Nicholas2; Remenschneider, Aaron K.3; Lee, Daniel J.2; Kozin, Elliott D.2
1 Department of Otolaryngology-Head and Neck Surgery, University of Arizona College of Medicine, Tucson, AZ
2 Department of Otolaryngology-Head and Neck Surgery, MEEI, Boston, MA
3 Eaton Peabody Laboratories, MEEI, Boston, MA
Introduction:
Augmented reality (AR) applications overlay visual cues onto existing environments and may provide an ideal platform for preoperative planning in transcanal endoscopic ear surgery (TEES). Herein, we aim to 1) use AR to enhance physical anatomic models by identifying critical landmarks and 2) demonstrate evidence that AR could be used for next-generation lateral skull base navigation.
Methods:
Cadaveric temporal bones (n=6) underwent high-resolution CT imaging (0.625 mm slice thickness). Soft tissue and bone were segmented and imported into the Unity game engine, and the bone model was also 3D-printed. An AR app was designed to display a live endoscope feed on a computer monitor. The 3D-printed model could be visualized endoscopically within the game engine, and structures including the jugular bulb, internal carotid artery, facial nerve, and cochlea were virtually superimposed over the live video. A TEES-reformatted CT was integrated into the heads-up display (HUD) for review.
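As an illustration only, the superimposition step can be sketched as projecting segmented 3D landmark points onto each live endoscope frame given an estimated camera pose. The sketch below uses Python with OpenCV and NumPy rather than the Unity environment described above; the intrinsics, landmark coordinates, and pose values are placeholders, not data from this study.

```python
# Hypothetical sketch: project segmented 3D landmarks (CT/model space, mm)
# onto a live endoscope frame with a pinhole camera model. The actual app
# was built in Unity; all names and numbers here are illustrative only.
import cv2
import numpy as np

# Assumed endoscope camera intrinsics (fx, fy, cx, cy in pixels).
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
dist = np.zeros(5)  # assume negligible lens distortion for the sketch

def overlay_landmarks(frame, landmarks_mm, rvec, tvec, color=(0, 0, 255)):
    """Draw 3D landmark points onto the frame; rvec/tvec give the model
    pose in the camera frame, e.g. from optical or controller tracking."""
    pts2d, _ = cv2.projectPoints(landmarks_mm.astype(np.float32),
                                 rvec, tvec, K, dist)
    for (u, v) in pts2d.reshape(-1, 2):
        cv2.circle(frame, (int(u), int(v)), 4, color, -1)
    return frame

# Example: mark a few placeholder points (e.g. along the facial nerve)
# on one frame from the live feed.
cap = cv2.VideoCapture(0)                              # live endoscope feed
facial_nerve_pts = np.array([[10.0, 5.0, 40.0],
                             [12.0, 6.0, 42.0]])       # placeholder coordinates
rvec, tvec = np.zeros(3), np.array([0.0, 0.0, 60.0])   # placeholder pose
ok, frame = cap.read()
if ok:
    frame = overlay_landmarks(frame, facial_nerve_pts, rvec, tvec)
    cv2.imshow("AR overlay", frame)
    cv2.waitKey(1)
cap.release()
```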
Results:
An AR app was successfully built on a desktop PC that displayed the virtual anatomic structures over the physical models. Nerves and vessels were visualized from multiple angles. Additionally, the CT imaging coordinates corresponding to the position of a tracked controller were updated in real time, displaying the correct reformatted slice.
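A minimal sketch of the slice-lookup idea, assuming a loaded CT volume, a known tracker-to-CT registration, and the 0.625 mm slice thickness noted above; the file name, transform, and coordinates are hypothetical and not taken from this study:

```python
# Hypothetical sketch: map a tracked controller position (tracker space, mm)
# to the nearest reformatted CT slice for display on the HUD.
import numpy as np

ct_volume = np.load("temporal_bone_ct.npy")   # assumed (slice, row, col) voxel array
slice_thickness_mm = 0.625                    # per the imaging protocol above
tracker_to_ct = np.eye(4)                     # assumed 4x4 tracker-to-CT registration

def slice_for_position(pos_mm):
    """Return the CT slice nearest the tracked controller position."""
    x, y, z, _ = tracker_to_ct @ np.append(pos_mm, 1.0)       # into CT space (mm)
    z_index = int(round(z / slice_thickness_mm))              # nearest slice index
    z_index = max(0, min(z_index, ct_volume.shape[0] - 1))    # clamp to volume
    return ct_volume[z_index]                                 # 2D slice for the HUD

# Each frame, the HUD would be refreshed with the latest controller coordinates:
hud_slice = slice_for_position(np.array([12.0, -4.5, 30.0]))
```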
Conclusion:
A desktop AR app that projects anatomic landmarks onto physical models is feasible. With improved tracking technology, AR could be used to predict anatomic constraints during preoperative planning. These findings also provide proof of concept that AR could be used in live surgery.