Medical Augmented Reality: Definition, Principle Components, Domain Modeling, and Design-Development-Validation Process

Nassir Navab, Alejandro Martin-Gomez, Matthias Seibold, Michael Sommersperger, Tianyu Song, Alexander Winkler, Kevin Yu, Ulrich Eck

Abstract

Three decades after the first work on Medical Augmented Reality (MAR) was presented to the international community, and ten years after the deployment of the first MAR solutions into operating rooms, its exact definition, basic components, systematic design, and validation still lack a detailed discussion. This paper defines the basic components of any Augmented Reality (AR) solution and extends them to exemplary Medical Augmented Reality Systems (MARS). We use some of the original MARS applications developed at the Chair for Computer Aided Medical Procedures over the last decades, and deployed in medical schools for teaching anatomy and in operating rooms for telemedicine and surgical guidance, to identify the corresponding basic components. In this regard, the paper does not discuss all past or existing solutions; rather, it aims to define the principle components, discuss the particular domain modeling for MAR and its design-development-validation process, and provide exemplary cases through the past in-house development of such solutions.

Keywords: Artificial Intelligence; Augmented Reality; Medical Augmented Reality; acoustic sensing; computer vision; multi-modal sensing; perceptual visualization; sonification; surgical data science; validation.

Conflict of interest statement

The authors declare no conflict of interest.

Figures

Figure 1
The apparatus of Brunelleschi consists of (A) a mirror and a painted or printed image; both components contain a hole for the user to look through. (B) The apparatus creates a linear projection that shows the image inside the user’s view. (C) The illustrated third-person view visualizes the frustum covered by the projection. Many modern AR applications use the same mathematical basis for creating in situ visualizations.
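The linear projection demonstrated by Brunelleschi’s apparatus is, in modern terms, the pinhole camera model that AR renderers still use to place virtual content in the user’s view. A minimal formulation in standard notation (not taken from the figure):

    \lambda \begin{pmatrix} u \\ v \\ 1 \end{pmatrix}
      = K \, [R \mid t] \begin{pmatrix} X \\ Y \\ Z \\ 1 \end{pmatrix},
    \qquad
    K = \begin{pmatrix} f_x & 0 & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{pmatrix}

where (X, Y, Z) is a point in the world, [R | t] is the viewer’s pose, K holds the intrinsics (focal lengths f_x, f_y and principal point c_x, c_y), and (u, v) are the resulting image coordinates up to the scale factor \lambda.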
Figure 2
The Medical Augmented Reality Framework consists of four primary components: Digital World, AR/VR Display, AR/VR User Interaction, and Evaluation. An MARS perceives the Physical World with its sensors and processes it into a medium that users may perceive and interact with through the AR/VR interfaces. Evaluation is integral to the system’s conception, development, and deployment.
Figure 3
Augmented Reality Teleconsultation System for Medicine (ArTekMed) combines point cloud reconstruction with Augmented and Virtual Reality. (A) Capturing the local site with the patient requires extrinsically calibrated RGB-D sensors, from which the system computes a real-time point cloud reconstruction. (B) The local user interacts with the real world while perceiving additional virtual content delivered with AR. (C) The remote user dons a VR headset and controller for interacting with the acquired point cloud. (D) The reconstruction represents the digital world known to the computer and is displayed to the VR user. (E) AR annotations made by the VR user are shown in situ on the patient.
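As a hypothetical minimal sketch of the per-sensor step in (A), the following Python/NumPy snippet lifts one depth image into the shared world frame, assuming pinhole intrinsics K and an extrinsic pose T_world_cam obtained from calibration; all names are illustrative, and the actual ArTekMed pipeline runs in real time on the GPU:

    import numpy as np

    def backproject(depth, K, T_world_cam):
        """Lift a depth image (H x W, meters) into a world-space point cloud.
        K: 3x3 pinhole intrinsics; T_world_cam: 4x4 pose of this camera in
        the shared world frame (from extrinsic calibration)."""
        h, w = depth.shape
        u, v = np.meshgrid(np.arange(w), np.arange(h))
        z = depth
        x = (u - K[0, 2]) * z / K[0, 0]
        y = (v - K[1, 2]) * z / K[1, 1]
        pts_cam = np.stack([x, y, z, np.ones_like(z)], axis=-1).reshape(-1, 4)
        pts_world = (T_world_cam @ pts_cam.T).T[:, :3]
        return pts_world[z.reshape(-1) > 0]  # drop invalid (zero-depth) pixels

Fusing several calibrated sensors then amounts to concatenating their world-space clouds, e.g., np.vstack over all views.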
Figure 4
Interaction Techniques Unique to ArTekMed: (A) The Magnorama creates a dynamic 3D cutout from the real-time reconstruction and allows the user to interact with and explore the space while intuitively creating annotations at the original region of interest by working within the duplicate. (B) The principle of Magnoramas translates well into AR. The resulting technique, Duplicated Reality, allows co-located collaboration in tight spaces, even without a remote user. (C) For remote users to experience more details of the patient and their surroundings, ArTekMed deploys Projective Bisector Mirrors to bridge the gap between reality and reconstruction through the mirror metaphor.
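The core of the Magnorama interaction in (A) is mapping a point annotated inside the magnified duplicate back to the original region of interest. A sketch under the simplifying assumption of a uniform, rotation-free magnification (the published technique is more general, and all names are illustrative):

    import numpy as np

    def duplicate_to_original(p_dup, c_dup, c_roi, scale):
        """Map a 3D point from the magnified duplicate back into the original
        region of interest: undo the magnification about the duplicate's
        center, then re-anchor the point at the ROI center."""
        return np.asarray(c_roi) + (np.asarray(p_dup) - np.asarray(c_dup)) / scale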
Figure 5
Typical setups in ophthalmic surgery consist of a complex operating area and multi-modal real-time imaging. Visual and auditory AR applications aim to improve perception and provide additional information while avoiding visual clutter and reducing the cognitive load imposed by complex intraoperative data.
Figure 6
CAMC aims to reduce the need for ionizing radiation and to provide spatially aware, intuitive visualization of joint optical and fluoroscopic data. (A) Calibration of the C-arm with the patient and with the technician’s and surgeon’s HMDs enables efficient surgical procedures in a collaborative ecosystem. (B) An advanced AR interface aids in planning better trajectories on the X-ray acquisitions. (C) The adaptive UI and augmentations for intra-operative planning and execution support various image-guided procedures.
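Once the calibration in (A) is in place, the optical and fluoroscopic images share the same viewing geometry, and the joint visualization reduces to an image overlay. A hypothetical sketch with OpenCV, assuming a fixed 3x3 homography H recovered during the one-time calibration (H and the function name are placeholders, not values or code from the paper):

    import cv2

    def overlay_xray(video_bgr, xray_gray, H, alpha=0.5):
        """Warp the X-ray into the optical view and alpha-blend the two.
        H aligns X-ray pixels to video pixels and comes from calibration."""
        h, w = video_bgr.shape[:2]
        warped = cv2.warpPerspective(xray_gray, H, (w, h))
        warped_bgr = cv2.cvtColor(warped, cv2.COLOR_GRAY2BGR)
        return cv2.addWeighted(video_bgr, 1.0 - alpha, warped_bgr, alpha, 0.0)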
Figure 7
The Magic Mirror visualizes anatomical structures in situ on the mirror reflection of the user standing in front of the RGB-D camera. Additionally, our Magic Mirror system displays, on the right half of the monitor, the transverse slice of a CT volume that matches the position selected by the user with their right hand.
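The slice-selection interaction can be sketched as a linear mapping from the tracked right-hand height (taken from the RGB-D skeleton) to a transverse slice index; the linear mapping and all names below are assumptions for illustration, not the system’s actual implementation:

    import numpy as np

    def slice_from_hand(hand_y, body_top_y, body_bottom_y, n_slices):
        """Normalize the hand height within the tracked body extent and
        pick the matching transverse CT slice (0 = topmost slice)."""
        t = np.clip((hand_y - body_top_y) / (body_bottom_y - body_top_y), 0.0, 1.0)
        return int(round(t * (n_slices - 1)))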

