Analysis of the accuracy and robustness of the leap motion controller

Frank Weichert, Daniel Bachmann, Bartholomäus Rudak, Denis Fisseler

Abstract

The Leap Motion Controller is a new device for hand-gesture-controlled user interfaces with declared sub-millimeter accuracy. However, its capabilities in real environments have not yet been analyzed. This paper therefore presents a first study of the Leap Motion Controller, focusing on the evaluation of its accuracy and repeatability. For an appropriate evaluation, a novel experimental setup was developed using an industrial robot with a reference pen, providing a position accuracy of 0.2 mm. With this setup, a deviation between a desired 3D position and the average measured position of below 0.2 mm was obtained for static setups and of 1.2 mm for dynamic setups. The conclusions of this analysis can improve the development of applications for the Leap Motion Controller in the field of human-computer interaction.
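To make the two reported quantities concrete, the following minimal Python sketch computes accuracy (deviation of the mean measured position from the desired position) and repeatability (spread of repeated measurements) for one static target. The function name and the sample data are hypothetical illustrations, not the authors' code or data.

```python
import math

def accuracy_and_repeatability(desired, samples):
    """Accuracy: Euclidean distance between the desired position and the
    mean of the measured positions. Repeatability: RMS distance of the
    measured positions from their own mean. All coordinates in mm.
    (Illustrative definitions, not taken verbatim from the paper.)"""
    n = len(samples)
    mean = tuple(sum(s[i] for s in samples) / n for i in range(3))
    accuracy = math.dist(desired, mean)
    repeatability = math.sqrt(sum(math.dist(s, mean) ** 2 for s in samples) / n)
    return accuracy, repeatability

# Hypothetical static measurement: one desired position, three noisy samples.
desired = (0.0, 200.0, 0.0)
samples = [(0.1, 200.1, -0.1), (-0.1, 199.9, 0.1), (0.0, 200.0, 0.0)]
acc, rep = accuracy_and_repeatability(desired, samples)
```

Separating the two measures matters because a sensor can be highly repeatable (tight cluster of samples) while still being inaccurate (cluster offset from the true position).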

Figures

Figure 1. Visualization of a (a) Real (using Infrared Imaging) and (b) Schematic View of the Leap Motion Controller.
Figure 2. Visualization of the Robot Cell Consisting of the Leap Motion Controller and an Industrial Robot (Kuka Robot KR 125/3) with a Reference Pen: (a) Front View and (b) Schematic View with a Coordinate System.
Figure 3. Visualization of the Basic Test Cases: (a) Positions in the xy-, xz- and yz-Planes and (b) Positions on a Sine Function Within the xy-Plane.
Figure 4. Analysis of the Accuracy and the Repeatability: Deviation between a Desired 3D Position and the Measured Positions for a Static Position, (a) xy-Variation; (b) xz-Variation; (c) yz-Variation.
Figure 5. Box-and-Whisker Plots for Different Tool Diameters (d = 3 mm, 4 mm, 5 mm, 6 mm, 8 mm and 10 mm) of the Average Deviation Concerning the (a) x-, (b) y- and (c) z-Axis.
Figure 6. Long-Time Measurement: Change of the (a) x-, (b) y- and (c) z-Coordinate over Time.
Figure 7. Analysis of Accuracy: Deviation between a Desired 3D Position and the Median of the Measured Positions, (a) xy-Plane; (b) xz-Plane; (c) yz-Plane.
Figure 8. Analysis of Accuracy of the Measured Positions. The Fixed Orientations Are Marked in Red. (a) xy-Plane; (b) xz-Plane; (c) yz-Plane.
Figure 9. Analysis of Accuracy as Bland-Altman Plots for the Three Axis-Aligned Movements of the Test Bar of the Robot for a Measuring Range from −100 mm to 100 mm. (a) x-Axis; (b) y-Axis; (c) z-Axis.
Figure 10. Analysis of Accuracy as Bland-Altman Plots for a Sine-Shaped Motion of the Test Bar of the Robot for a Measuring Range from −25 mm to 25 mm. (a) x-Deviation; (b) y-Deviation; (c) z-Deviation.
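Figures 9 and 10 use Bland-Altman plots to assess agreement between the robot's reference positions and the controller's measurements. A minimal Python sketch of the underlying statistics, the bias (mean difference) and the 95% limits of agreement, might look as follows; the function name and the sample data are hypothetical, not taken from the paper.

```python
import statistics

def bland_altman(reference, measured):
    """Bland-Altman agreement statistics along one axis: bias (mean
    difference measured - reference) and the 95% limits of agreement
    (bias +/- 1.96 standard deviations of the differences)."""
    diffs = [m - r for r, m in zip(reference, measured)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical robot reference positions and controller measurements in mm.
reference = [-100.0, -50.0, 0.0, 50.0, 100.0]
measured = [-99.8, -50.1, 0.2, 50.0, 100.3]
bias, lo, hi = bland_altman(reference, measured)
```

In the plot itself, each point's x-coordinate is the mean of the two methods and its y-coordinate their difference; the bias and limits of agreement are drawn as horizontal lines.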


Source: PubMed
