Understanding the brain requires understanding how its constituent neurons perform computations, and how those computations relate to human behavior. The primate visual system offers a unique opportunity, not available in any other animal model, to characterize how an important quantity, visual motion, is represented by sensory input neurons, and to relate this representation to how cortical neurons process visual motion. Achieving this goal requires an interdisciplinary approach that merges large-scale physiological recordings, to simultaneously acquire signals from thousands of sensory neurons, with novel hardware and software to understand how these neural signals combine to represent motion in the brain.
Today, we can concurrently record the activity of ~200 primate “parasol” retinal ganglion cells, whose outputs the brain uses to sense motion. However, this number is too small to fully understand the neural computations that subserve motion sensing. We will therefore develop the technology required to scale recordings of the sensory representation of moving visual stimuli to thousands of parasol cells, using novel hardware with tens of thousands of extracellular electrodes. We will record how parasol cells encode moving visual stimuli, and we will develop algorithms to decode these motion signals. Finally, we will test the hypothesis that cortical neurons efficiently extract information about motion from their inputs, by comparing the responses of cortical neurons measured in preceding studies with the output of efficient algorithms designed to extract information from the collection of sensory cells we record.
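To make the decoding idea concrete, the sketch below simulates spike counts from a hypothetical population of ~200 cells with simple linear speed tuning and Poisson variability, then fits a least-squares linear readout of stimulus speed. Every detail here (cell count, tuning model, noise model, decoder choice) is an illustrative assumption, not the actual methods or data of the proposed work.

```python
# Illustrative sketch only: decoding stimulus speed from simulated population
# spike counts with a linear least-squares readout. The tuning and noise
# models are hypothetical assumptions, not the proposal's actual methods.
import numpy as np

rng = np.random.default_rng(0)

n_cells = 200    # roughly the current simultaneous-recording scale cited above
n_trials = 500

# Hypothetical tuning: each cell's mean spike count is linear in stimulus speed.
gains = rng.normal(1.0, 0.3, n_cells)
baselines = rng.uniform(2.0, 5.0, n_cells)

speeds = rng.uniform(0.0, 10.0, n_trials)            # simulated speeds (deg/s)
rates = baselines[None, :] + gains[None, :] * speeds[:, None]
counts = rng.poisson(np.clip(rates, 0.0, None))      # Poisson spike counts

# Linear readout with an intercept term: speed_hat = counts @ w + b
X = np.hstack([counts, np.ones((n_trials, 1))])
w, *_ = np.linalg.lstsq(X, speeds, rcond=None)

speed_hat = X @ w
rmse = np.sqrt(np.mean((speed_hat - speeds) ** 2))
print(f"decoding RMSE: {rmse:.3f} deg/s")
```

Pooling across many cells averages out the Poisson variability of individual cells, so the readout recovers speed far more accurately than any single cell could; scaling the recorded population from hundreds to thousands of cells is what makes such population-level analyses informative.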
This will provide the first complete characterization of how a behaviorally significant neural computation is performed by large ensembles of sensory neurons.