A top priority for people with paralysis is regaining the ability to reach and grasp. Robotic arm prostheses, or electrical stimulation of paralyzed muscles, can in principle meet this need, but both require a better way for the patient to control them. Existing methods rely on laborious and unintuitive sequences of simple commands using the remaining muscles (e.g., moving the tongue or eyes). Reading out the person's desired movement directly from their brain could overcome these limitations by allowing them to command the whole arm and hand naturally and simultaneously, simply by thinking about moving. My research is focused on finding a way to obtain such "high degree-of-freedom" command signals from the brain.
As part of the BrainGate clinical trial, we implant sensors into motor areas of paralyzed volunteers’ cortex. This lets us detect neural activity related to the person imagining movements. We then ‘decode’ this activity, via a computer algorithm, into a movement command. I started this project by extending state-of-the-art decoding techniques (recently used to accurately control a 2D computer mouse) to allow our participants to control the 3D position and 2D orientation of a robotic arm. My initial goal is to push this method to its limit by asking the participant to imagine a wider range of arm movements. Inevitably, we will encounter imagined movements that existing methods cannot reliably decode from neural activity because we don’t happen to record from the right neurons. I will then test two strategies, both inspired by recent advances in movement systems neuroscience, to identify candidate neural signals that the person can, through practice, hone into a useful command signal. This will provide a generalizable way to increase the range and complexity of movements that neural prostheses can restore to people with paralysis, so that they can independently perform activities of daily living.
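To make the decoding step concrete, a common approach in this field is a linear readout: binned spike counts from the recorded channels are mapped through a learned weight matrix to a velocity command, one per degree of freedom, and smoothed over time for stable control. The sketch below is purely illustrative; the weights, channel count, and smoothing scheme are hypothetical placeholders, not the decoder used in the trial.

```python
import numpy as np

# Illustrative linear neural decoder (all names and values are
# hypothetical, not the BrainGate implementation). Each time step,
# binned spike counts from n_channels recorded electrodes are mapped
# to a velocity command for n_dof degrees of freedom, then
# exponentially smoothed so the resulting movement is stable.

rng = np.random.default_rng(0)
n_channels, n_dof = 96, 5  # e.g., 3D position + 2D orientation

W = rng.normal(scale=0.1, size=(n_dof, n_channels))       # learned weights
baseline = rng.poisson(5, size=n_channels).astype(float)  # mean firing rates

def decode_step(spike_counts, prev_velocity, alpha=0.8):
    """Map one bin of spike counts to a smoothed velocity command."""
    raw = W @ (spike_counts - baseline)                # linear readout
    return alpha * prev_velocity + (1 - alpha) * raw   # exponential smoothing

# Simulate decoding 50 bins of (random, stand-in) neural data.
v = np.zeros(n_dof)
for _ in range(50):
    counts = rng.poisson(5, size=n_channels).astype(float)
    v = decode_step(counts, v)

print(v.shape)  # one command per degree of freedom
```

In practice the weights would be fit from data recorded while the participant imagines cued movements, and richer models (e.g., Kalman filters) replace the simple smoothing shown here.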