Attractor dynamics in networks with learning rules inferred from data - Nicolas Brunel

Event Details:

Monday, April 17, 2017
Time: 10:00am to 11:00am PDT
Contact: Taylor Miller
Event Sponsor: Stanford Neurosciences Institute

Special Seminar Series in Theoretical/Computational Neuroscience

Attractor dynamics in networks with learning rules inferred from data

Nicolas Brunel, PhD
Professor, Departments of Statistics and Neurobiology, University of Chicago

Host: Ivan Soltesz

Abstract: The attractor neural network (ANN) scenario is a popular framework for memory storage in association cortex, but a large gap remains between these models and experimental data. In particular, the distributions of the learned patterns and the learning rules are typically not constrained by data. In primate IT cortex, the distribution of neuronal responses is close to lognormal, at odds with the bimodal distributions of firing rates used in the vast majority of theoretical studies. Furthermore, we recently showed that differences between the statistics of responses to novel and familiar stimuli are consistent with a Hebbian learning rule whose dependence on post-synaptic firing rate is non-linear and dominated by depression. We investigated the dynamics of a network model in which both the distribution of the learned patterns and the learning rule are inferred from data. Using both mean-field theory and simulations, we show that this network exhibits attractor dynamics. Furthermore, we show that the storage capacity of networks with learning rules inferred from data is close to the optimal capacity in the space of unsupervised Hebbian rules. These networks lead to unimodal distributions of firing rates during the delay period, consistent with data from delayed match-to-sample experiments. Finally, we show that there is a transition to a chaotic phase at strong coupling strength, with an extensive number of chaotic attractor states correlated with the stored patterns.
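The ingredients of the abstract — lognormal firing-rate patterns stored with a separable Hebbian rule whose postsynaptic dependence is non-linear and depression-dominated, in a rate network with attractor dynamics — can be illustrated with a minimal simulation. The sketch below is NOT the actual model or inferred learning rule from the talk: the tanh plasticity factors, the saturating threshold-linear transfer function, and all parameter values (`N`, `P`, `A`, `I0`, `r_max`) are illustrative choices made here for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Illustrative parameters (not taken from the talk) ---
N, P = 500, 5        # neurons, stored patterns
A = 30.0             # coupling strength
r_max = 10.0         # firing-rate saturation
I0 = 1.0             # uniform background input
dt, steps = 0.1, 2000

# Lognormal firing-rate patterns, normalized to unit mean.
patterns = rng.lognormal(mean=0.0, sigma=1.0, size=(P, N))
patterns /= patterns.mean(axis=1, keepdims=True)

# Separable Hebbian rule J_ij ~ sum_mu f(xi_i^mu) g(xi_j^mu).  The
# postsynaptic factor f is thresholded and non-linear, a caricature of a
# depression-dominated rule; both factors are mean-centered per pattern
# to reduce uniform cross-talk between patterns.
f_post = np.tanh(patterns - 1.0)     # non-linear in postsynaptic rate
g_pre = np.tanh(patterns)            # presynaptic factor
f_post -= f_post.mean(axis=1, keepdims=True)
g_pre -= g_pre.mean(axis=1, keepdims=True)

J = (A / N) * f_post.T @ g_pre
np.fill_diagonal(J, 0.0)

def phi(x):
    """Threshold-linear transfer function with saturation at r_max."""
    return np.clip(x, 0.0, r_max)

def simulate(r0):
    """Euler-integrate tau dr/dt = -r + phi(J r + I0), with tau = 1."""
    r = r0.copy()
    for _ in range(steps):
        r += dt * (-r + phi(J @ r + I0))
    return r

# Cue the network with stored pattern 0 and let it settle.
r_final = simulate(patterns[0])

# Overlap (Pearson correlation) of the settled state with each pattern:
# a retrieval (attractor) state stays correlated with the cued pattern.
overlaps = np.array([np.corrcoef(r_final, p)[0, 1] for p in patterns])
print(overlaps)
```

When run, the settled state remains most strongly correlated with the cued pattern, a toy analogue of the delay-period attractor states discussed in the abstract; note that the retrieved state is a graded (unimodal-input) rate profile rather than a binary pattern.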