ARssist: augmented reality on a head-mounted display for the first assistant in robotic surgery

Long Qian, Anton Deguet, Peter Kazanzides

Abstract

In robot-assisted laparoscopic surgery, the first assistant (FA) is responsible for tasks such as robot docking, passing necessary materials, manipulating hand-held instruments, and helping with trocar planning and placement. The performance of the FA is critical to the outcome of the surgery. The authors introduce ARssist, an augmented reality application based on an optical see-through head-mounted display, to help the FA perform these tasks. ARssist offers (i) real-time three-dimensional rendering of the robotic instruments, hand-held instruments, and endoscope, based on a hybrid tracking scheme, and (ii) real-time stereo endoscopy that is configurable to suit the FA's hand-eye coordination when operating under endoscopic guidance. ARssist has the potential to help the FA perform these tasks more efficiently, and hence improve the outcome of robot-assisted laparoscopic surgery.
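
The hybrid tracking scheme behind (i) amounts to chaining an optically tracked fiducial pose with the robot's forward kinematics so that each instrument can be expressed in the display frame (see Figs. 2 and 8). The following is a minimal sketch of that transform composition in Python; the frame names, the numeric poses, and the fixed marker-to-base calibration are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def make_transform(R, t):
    """Build a 4x4 homogeneous transform from rotation R (3x3) and translation t (3,)."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Hypothetical inputs, stand-ins for the real tracking sources:
# T_hmd_marker : fiducial marker pose in the HMD camera frame (optical tracking)
# T_marker_base: fixed marker-to-robot-base offset (a one-time calibration)
# T_base_tip   : instrument tip pose in the robot base frame (forward kinematics)
T_hmd_marker = make_transform(np.eye(3), np.array([0.2, 0.0, 0.5]))
T_marker_base = make_transform(np.eye(3), np.array([0.0, -0.1, 0.0]))
T_base_tip = make_transform(np.eye(3), np.array([0.05, 0.0, 0.3]))

# Hybrid tracking: chain the optically tracked pose with the kinematic pose
# to place the instrument model in the HMD frame for rendering.
T_hmd_tip = T_hmd_marker @ T_marker_base @ T_base_tip
print(T_hmd_tip[:3, 3])  # instrument tip position in the HMD frame
```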

Keywords: ARssist; FA's hand-eye coordination; augmented reality; augmented reality application; endoscopes; endoscopy feedback; first assistant; hand-held instruments; helmet mounted displays; hybrid tracking scheme; medical robotics; optical see-through head-mounted display; real-time stereo endoscopy; real-time three-dimensional rendering; rendering (computer graphics); robot docking; robot-assisted laparoscopic surgery; robotic instruments; stereo image processing; surgery; trocar planning.

Figures

Fig. 1 Surgery team with a da Vinci S® surgical robot; image © 2018 Intuitive Surgical, Inc.
Fig. 2 Components of ARssist and their relative transformations
Fig. 3 Illustration of display calibration in ARssist
Fig. 4 Visualisation results of ARssist: (a) transparent body phantom; (b) before display calibration; (c) with display calibration; (d) overlay with a hand-held instrument; (e) virtual monitor visualisation of the endoscopy; (f) endoscopy visualisation registered with the viewing frustum
Fig. 5 dVRK setup
Fig. 6 Setup of eye-simulating cameras for obtaining visualisation results (Fig. 4) of ARssist
Fig. 7 Data flow in ARssist
Fig. 8 Fiducial markers on robotic arms and hand-held instrument
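
Figs. 3 and 4b-4c concern display calibration for the optical see-through HMD, which in SPAAM-style approaches reduces to estimating a 3x4 projection matrix that maps tracked 3D points to 2D screen locations. Below is a minimal sketch of such an estimation via direct linear transformation (DLT) on synthetic data; it illustrates the underlying math only and should not be read as the calibration procedure used in ARssist.

```python
import numpy as np

def dlt_projection(points_3d, points_2d):
    """Estimate a 3x4 projection matrix P with x ~ P X (homogeneous),
    from >= 6 point correspondences, via SVD."""
    rows = []
    for (X, Y, Z), (u, v) in zip(points_3d, points_2d):
        rows.append([X, Y, Z, 1, 0, 0, 0, 0, -u*X, -u*Y, -u*Z, -u])
        rows.append([0, 0, 0, 0, X, Y, Z, 1, -v*X, -v*Y, -v*Z, -v])
    _, _, Vt = np.linalg.svd(np.asarray(rows))
    return Vt[-1].reshape(3, 4)  # null vector = least-squares solution

# Synthetic check: project known 3D points with a known camera, then recover it.
P_true = np.hstack([np.eye(3), np.array([[0.1], [0.0], [1.0]])])
pts = np.random.default_rng(0).uniform(-1, 1, size=(8, 3)) + [0, 0, 3]
x_h = (P_true @ np.hstack([pts, np.ones((8, 1))]).T).T
x = x_h[:, :2] / x_h[:, 2:]
P_est = dlt_projection(pts, x)
print(P_est / P_est[-1, -1])  # matches P_true up to scale
```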
