Blending of brain-machine interface and vision-guided autonomous robotics improves neuroprosthetic arm performance during grasping

John E Downey, Jeffrey M Weiss, Katharina Muelling, Arun Venkatraman, Jean-Sebastien Valois, Martial Hebert, J Andrew Bagnell, Andrew B Schwartz, Jennifer L Collinger

Abstract

Background: Recent studies have shown that brain-machine interfaces (BMIs) offer great potential for restoring upper limb function. However, grasping objects is a complicated task and the signals extracted from the brain may not always be capable of driving these movements reliably. Vision-guided robotic assistance is one possible way to improve BMI performance. We describe a method of shared control where the user controls a prosthetic arm using a BMI and receives assistance with positioning the hand when it approaches an object.

Methods: Two human subjects with tetraplegia used a robotic arm to complete object transport tasks with and without shared control. The shared control system was designed to balance BMI-derived intention with computer assistance. An autonomous robotic grasping system identified and tracked objects and defined stable grasp positions for them. The system identified when the user intended to interact with an object based on the BMI-controlled movements of the robotic arm. Under shared control, BMI-controlled movements and autonomous grasping commands were blended to ensure secure grasps.
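
As a rough illustration of the blending step described above, the sketch below combines a BMI-decoded endpoint velocity with an autonomous grasp-approach velocity using a single blending weight. The linear form, the function and variable names, and the weight range are assumptions for illustration only; they are not the published controller.

```python
import numpy as np

def blend_command(bmi_velocity, robot_velocity, alpha):
    """Blend BMI-decoded and autonomous endpoint velocity commands.

    alpha = 0.0 -> command comes entirely from the BMI user,
    alpha = 1.0 -> command comes entirely from the autonomous system.
    Both inputs are 3-element translational velocity commands (m/s).
    """
    alpha = float(np.clip(alpha, 0.0, 1.0))
    bmi_velocity = np.asarray(bmi_velocity, dtype=float)
    robot_velocity = np.asarray(robot_velocity, dtype=float)
    return (1.0 - alpha) * bmi_velocity + alpha * robot_velocity

# Example: mostly robot-driven approach (alpha = 0.7)
command = blend_command([0.05, 0.0, -0.02], [0.0, 0.03, -0.04], 0.7)
```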

Results: Both subjects were more successful on object transfer tasks when using shared control compared to BMI control alone. Movements made using shared control were more accurate, more efficient, and less difficult. One participant attempted a task with multiple objects and successfully lifted one of two closely spaced objects in 92% of trials, demonstrating the potential for users to accurately execute their intention while using shared control.

Conclusions: Integration of BMI control with vision-guided robotic assistance led to improved performance on object transfer tasks. Providing assistance while maintaining generalizability will make BMI systems more attractive to potential users.

Trial registration: NCT01364480 and NCT01894802.

Keywords: Assistive technology; Brain-computer interface; Brain-machine interface; Neuroprosthetic; Shared mode control.

Figures

Fig. 1
Array location. The approximate locations of the microelectrode recording arrays for both subjects on a template brain. Subject 1 had two 96-channel arrays implanted in M1 (green squares). Subject 2 had two 88-channel arrays implanted in S1 (yellow squares) and two 32-channel arrays implanted more posteriorly (yellow rectangles)
Fig. 2
Shared control system diagram and robot testing setup. a System diagram for the vision-guided shared control. The blue boxes show the BMI system decoding endpoint translational and grasp velocity. The green boxes show the components of the vision-guided robotic system for grasping. Without shared control, only the output of the BMI system was used to send commands to the arm; with shared control, the control signal of the vision-guided system was blended with that of the BMI system to create the final robot command. b The 7.5 cm cube (yellow) and the target box (clear box) were positioned on the table, as shown, to start the ARAT trials. The subject sat approximately 1 m to the left of the robot. c An example of the central cross-section of the grasp envelope for a stable grasp position on a 7.5 cm cube is outlined by the blue dotted line. The shading shows the gradient of shared control (α value), with white areas completely controlled by the BMI user and darker areas having more robot control. d A trial progression schematic showing when translation and grasp control are under BMI control (blue) or robot control (green). Wrist orientation was always maintained in a neutral posture under computer control
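
Panel c describes α increasing from 0 (full BMI control) at the edge of the grasp envelope toward heavier robot control near the stable grasp position. Below is a minimal sketch of one way such a distance-based schedule could be computed; the envelope radii and the linear ramp are illustrative assumptions, not the values used in the study.

```python
import numpy as np

def blending_alpha(hand_position, grasp_position,
                   outer_radius=0.15, inner_radius=0.03):
    """Illustrative distance-based blending weight inside a grasp envelope.

    Returns 0.0 (pure BMI control) when the hand is outside the envelope
    and ramps linearly toward 1.0 (mostly robot control) as the hand
    approaches the stable grasp position. Distances are in metres.
    """
    distance = np.linalg.norm(np.asarray(hand_position, dtype=float)
                              - np.asarray(grasp_position, dtype=float))
    if distance >= outer_radius:
        return 0.0
    if distance <= inner_radius:
        return 1.0
    return (outer_radius - distance) / (outer_radius - inner_radius)
```
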
Fig. 3
Target positions for the multiple object task. The 7.5 cm target cubes filled the squares in the diagram and were separated by 10 cm. For a single trial, the cubes were placed at two positions connected by dashed lines, and the subject was instructed to pick up one of the two cubes. The position numbers correspond to the target numbers in Table 2. The cube in Fig. 2b is at the same point on the table as the intersection of the dashed lines here. The “Cameras” box and hand position arrow indicate the locations of those components of the robot at the start of the trial
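
In the multiple object task the controller must infer which of the two tracked cubes the user intends to grasp from the BMI-commanded motion. The sketch below shows one plausible heuristic, scoring each object by how directly the commanded velocity points toward it; this scoring rule and the function names are illustrative assumptions, not the method reported in the paper.

```python
import numpy as np

def infer_target(hand_position, commanded_velocity, object_positions):
    """Pick the tracked object the commanded motion points toward most directly.

    Returns the index of the object whose direction from the hand has the
    highest cosine similarity with the commanded velocity, or None if the
    commanded velocity is essentially zero.
    """
    v = np.asarray(commanded_velocity, dtype=float)
    speed = np.linalg.norm(v)
    if speed < 1e-6:
        return None
    scores = []
    for obj in object_positions:
        direction = np.asarray(obj, dtype=float) - np.asarray(hand_position, dtype=float)
        dist = np.linalg.norm(direction)
        if dist < 1e-6:
            scores.append(1.0)  # hand is already at this object
        else:
            scores.append(float(np.dot(v, direction) / (speed * dist)))
    return int(np.argmax(scores))
```
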
Fig. 4
ARAT performance and difficulty. a The frequency of each trial result for Subject 1 (left) and Subject 2 (right). Completion times are shown for successful trials and the failure mode (time out or out of bounds) is noted for failed trials. Assisted (blue bars) and unassisted (red bars) trials are shown separately. b The frequency of each reported difficulty score for assisted and unassisted trial sets (1 = extremely easy, 10 = extremely difficult). Both subjects were more successful and reported that the task was easier during the trials with shared control
Fig. 5
Analysis of trajectory properties with and without shared control. a Box plot distributions of hand translation speeds across all time bins in which the hand was less than 10 cm above the table during successful trials. The red line is the median speed, the blue box shows the interquartile range, and the whiskers span the 5th to 95th percentiles. The speed distribution for assisted trials skews low for both subjects, indicating that the hand was steadier when approaching the object. b Subject 1’s path lengths during successful trials, first for the full trials, then separated into the path length before the first grasp attempt and the path length after the object was grasped. Error bars span the interquartile range. The assisted trials benefit the most during the pre-grasp portion of the trial. c Subject 1’s hand trajectories with the median path length for each assistance condition. The color shows the grasp aperture. The release point is marked where the hand opened to allow the object to drop onto the platform. We did not specify to the subject how the object had to be placed, or released, onto the platform. Additional file 2: Movie S2 shows both trials
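
The speed and path-length measures in this figure can be derived from sampled hand positions along each trajectory. The sketch below shows one straightforward way to compute them; the coordinate convention (z measured upward from the table surface), the fixed bin duration, and the 10 cm height threshold as a parameter are assumptions for illustration.

```python
import numpy as np

def path_length(positions):
    """Total Euclidean path length of an endpoint trajectory (N x 3 array, metres)."""
    positions = np.asarray(positions, dtype=float)
    return float(np.sum(np.linalg.norm(np.diff(positions, axis=0), axis=1)))

def approach_speeds(positions, dt, height_threshold=0.10):
    """Translation speeds (m/s) in time bins where the hand is below the height threshold.

    Assumes the z coordinate is height above the table in metres and dt is the
    duration of each time bin in seconds.
    """
    positions = np.asarray(positions, dtype=float)
    speeds = np.linalg.norm(np.diff(positions, axis=0), axis=1) / dt
    near_table = positions[1:, 2] < height_threshold
    return speeds[near_table]
```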


Source: PubMed
