Intracortical brain-computer interfaces (iBCIs) can restore lost communication and motor function to people with severe speech and motor impairment caused by neurological injury or disease. An iBCI measures neural activity from the brain, decodes that activity into control signals, and uses those signals to drive prosthetic devices such as computer cursors and prosthetic arms. To date, this control has predominantly been achieved by placing small penetrating sensors in brain areas believed to represent movements of a particular body part, following the traditional brain map in which face, arm, and leg movements are represented in distinct cortical areas. Contrary to this traditional view, we recently found representation of all body parts intermixed within a small patch of the “arm/hand” area of the brain. Leveraging this finding, I created the first iBCI system capable of classifying movements across all four limbs from neural activity. This demonstration allows us to start considering ways to restore movement across the entire body using implants confined to one small region of motor cortex. One specific application is multi-limb control as a step toward a whole-body iBCI, which may in the future enable neural control of exoskeleton suits or humanoid robots as proxies for the paralyzed body.
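To make the decoding step concrete, the following is a purely illustrative sketch, not the actual system: synthetic firing rates and a toy nearest-centroid classifier stand in for real recorded neural activity and the real decoder. All channel counts, class labels, and noise levels here are hypothetical assumptions.

```python
# Illustrative sketch only: synthetic data and a toy nearest-centroid
# classifier stand in for the actual iBCI decoder described in the text.
import numpy as np

rng = np.random.default_rng(0)
LIMBS = ["left_arm", "right_arm", "left_leg", "right_leg"]
N_CHANNELS = 96          # hypothetical channel count for one microelectrode array
TRIALS_PER_LIMB = 50

# Simulate trial firing-rate features: each limb movement gets its own
# (hypothetical) mean activity pattern across channels, plus noise.
means = rng.normal(0.0, 1.0, size=(len(LIMBS), N_CHANNELS))
X = np.vstack([
    means[i] + rng.normal(0.0, 0.5, size=(TRIALS_PER_LIMB, N_CHANNELS))
    for i in range(len(LIMBS))
])
y = np.repeat(np.arange(len(LIMBS)), TRIALS_PER_LIMB)

# "Train": estimate one centroid per limb from the first half of each limb's trials.
train = np.arange(len(y)) % TRIALS_PER_LIMB < TRIALS_PER_LIMB // 2
centroids = np.stack([X[train & (y == i)].mean(axis=0) for i in range(len(LIMBS))])

# "Decode": assign each held-out trial to the limb with the nearest centroid.
test = ~train
dists = np.linalg.norm(X[test, None, :] - centroids[None, :, :], axis=2)
pred = dists.argmin(axis=1)
accuracy = (pred == y[test]).mean()
print(f"held-out classification accuracy: {accuracy:.2f}")
```

With well-separated synthetic patterns the toy classifier decodes the four limbs nearly perfectly; the scientific challenge in the real system is that movement representations of different body parts are intermixed within the same small patch of cortex.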
I aim to address this problem in three ways: 1) characterizing how the brain encodes simultaneous multi-limb movements, and how that code differs from the code for movements of single body parts; 2) developing neural decoders based on deep learning techniques that exploit how the brain encodes complex multi-limb movements; and 3) designing iBCI systems capable of real-time control of multiple effectors, such as multiple cursors for point-and-click typing or a humanoid avatar in virtual reality. This work will lay the foundation for a rich line of iBCI research aimed at restoring whole-body motion to people with paralysis.