
https://stanford.zoom.us/j/95123591087?pwd=WlZmajA5c3pVYmIyVlhyN0NHeXpLQT09
Learning fast and slow: computation, plasticity and metaplasticity in a neural circuit
Thesis Defense: Brandon Jay Bhasin
Advisors: Jennifer Raymond (Neurobiology) & Mark Goldman (UC Davis Neuroscience)
Abstract
Memory, the ability to transform transient information into a more persistent form, is a fundamental property of neural systems, allowing them to learn from experience. Memories are stored in a variety of substrates, from persistent neural activity to changes in synaptic weights; experience may even alter the interactions within the molecular networks that support synaptic weight changes themselves. The processes by which memories are stored, maintained, and transformed are dynamical, arising from the interplay of these substrate elements across spatial and temporal scales, both fast and slow. Here I use mathematical and computational methods to investigate the dynamics of three such processes: the storage of a long-term memory through synaptic plasticity, the transfer of long-term memory between synaptic sites during systems consolidation, and the tuning of the rules for inducing synaptic plasticity through metaplasticity. I present a conceptual unification of short-term and long-term memory, which have long been considered unrelated phenomena, by demonstrating that the consolidation of graded long-term memories in persistent synaptic weight changes is analogous to the storage of graded memories in persistent neural activity during short-term (working) memory: both processes are described by line attractor dynamics and implement temporal integration. Finally, I propose biologically plausible mechanisms for a newly discovered form of metaplasticity, each of which generates associative synaptic plasticity rules tuned to different features of the correlation structure of a neuron's inputs.
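
To make the line attractor analogy concrete, the following minimal sketch (an illustrative assumption, not taken from the thesis itself) shows a one-dimensional integrator in which the stored variable can be read either as persistent neural activity accumulating its inputs (short-term memory) or as a synaptic weight accumulating plasticity events (long-term memory): in both readings, every value of the variable is a fixed point once the input stops, so a transient input leaves behind a graded, persistent memory.

```python
import numpy as np

def integrate(u, dt=0.001, k=1.0):
    """Temporal integration along a line attractor: x accumulates the input u."""
    x = np.zeros(len(u) + 1)
    for t, u_t in enumerate(u):
        # No decay term: when u = 0, x stays wherever it is (a line of fixed points).
        x[t + 1] = x[t] + k * u_t * dt
    return x

# A brief "experience" (an input pulse, or a bout of plasticity induction) ...
u = np.zeros(5000)
u[1000:1500] = 2.0

# ... is converted into a persistent, graded stored value.
x = integrate(u)
print(f"value at end of input: {x[1500]:.2f}, value long after input: {x[-1]:.2f}")
```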