A top priority for people with paralysis is regaining the ability to reach and grasp. Technologies such as robotic arm prostheses or electrical stimulation of paralyzed muscles can, in principle, meet this need – but they require a better way for the user to command their desired movements. Existing methods, which rely on the remaining muscles, are unintuitive and require laborious sequences of simple commands. Reading out a patient’s desired movement directly from their brain could overcome these limitations, allowing them to command the whole arm and hand naturally and simultaneously, simply by thinking about moving. My research is focused on obtaining such “high degree-of-freedom” command signals from the brain.
As part of the BrainGate clinical trial, we implant sensors into the motor cortex of paralyzed volunteers. This lets us detect neural activity related to the person imagining movements. We then ‘decode’ this activity, via a computer algorithm, into a movement command. I started this project by extending state-of-the-art decoding techniques (recently used to accurately control a 2D computer mouse) to let our participants control the 3D position and 2D orientation of a robotic arm. My initial goal is to push this method to its limit by asking participants to imagine a wider range of arm movements. Inevitably, we will encounter imagined movements that existing methods cannot reliably decode from neural activity. I will then test two hypotheses, both inspired by recent advances in movement systems neuroscience, to identify candidate neural readout signals that the person can, through practice, hone into a useful command signal. Finally, I will close the neuroscience → clinical application → neuroscience loop by answering fundamental questions about the flexibility and function of human motor cortex. This will provide a generalizable strategy for increasing the complexity of movements afforded by neural prostheses.
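The core idea behind the decoders described above can be illustrated with a minimal sketch: a linear mapping fit from neural firing rates to intended movement velocity. This is not the BrainGate pipeline itself; the synthetic data, dimensions, and regression setup below are hypothetical, chosen only to show how a population of noisy, linearly tuned neurons can be read out as a continuous command signal.

```python
import numpy as np

# Hypothetical setup: 50 neurons whose firing rates are noisy linear
# functions of an intended 2D velocity (vx, vy).
rng = np.random.default_rng(0)
n_neurons, n_samples = 50, 2000
true_W = rng.normal(size=(n_neurons, 2))              # each neuron's tuning to (vx, vy)
velocity = rng.normal(size=(n_samples, 2))            # intended 2D velocities
rates = velocity @ true_W.T + 0.5 * rng.normal(size=(n_samples, n_neurons))

# Fit decoder weights by ridge regression:
#   W_hat = (X^T X + lam * I)^-1 X^T y
lam = 1.0
X, y = rates, velocity
W_hat = np.linalg.solve(X.T @ X + lam * np.eye(n_neurons), X.T @ y)

# Decoded velocity commands for each sample of neural activity.
decoded = X @ W_hat
r = np.corrcoef(decoded[:, 0], velocity[:, 0])[0, 1]
```

With enough tuned neurons relative to the noise, the decoded velocities track the intended ones closely; real-time decoders used in cursor and robotic-arm control build on this linear-readout principle with recursive filtering and frequent recalibration.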