
Mind, Brain, Computation and Technology graduate training seminar - Brian Hsueh and Yuan Chang Leong

Brian Hsueh, Center for Mind Brain Computation and Technology
January 14, 2019 - 5:10pm to 6:30pm
Sloan Hall, Math Bldg 380, Room 380-C

Activity-based projection mapping to identify cortical representations of internal states

Brian Hsueh

Mind, Brain, Computation and Technology graduate trainee, Stanford University

Abstract

How cortical ensembles encode internal homeostatic states and coordinate with other structures throughout the brain to drive adaptive behavior remains a fundamental question in neurobiology. To gain access to such cortical populations, we developed an expanded viral toolkit for permanent genetic labeling of activity-defined neuronal populations, supporting both unbiased, whole-brain labeling and targeted, projection-specific labeling. In conjunction with a computational framework enabling automated quantification of cortical projections in atlas-registered, cleared whole mouse brains, we performed a brain-wide screen for projections activated by hunger and found increased activation of a projection from posterior insular cortex to the amygdala. Optogenetic stimulation of this projection was sufficient to increase food consumption in otherwise sated animals without affecting water consumption, and to elicit aversion to a previously neutral environment, while optogenetic inhibition suppressed consumption in fasted animals. These findings demonstrate the utility of this new viral approach for activity-dependent input mapping, and support a role for the posterior insula in top-down coordination of responses to perturbed homeostatic states.


Related papers

[1] Hsueh B, Burns VM, Pauerstein P, Holzem K, Ye L, Engberg K, Wang AC, Gu X, Chakravarthy H, Arda HE, Charville G, Vogel H, Efimov IR, Kim S, Deisseroth K. (2017). Pathways to clinical CLARITY: volumetric analysis of irregular, soft, and heterogeneous tissues in development and disease. Scientific Reports. 7(1):5899. doi: 10.1038/s41598-017-05614-4.

Motivated perception: How the brain sees what it wants to see

Yuan Chang Leong


Mind, Brain, Computation and Technology graduate trainee, Stanford University


Abstract

People tend to believe their perceptions are veridical representations of the world, but also commonly report perceiving what they want to see or hear. Do desires and wants alter perceptual experience, or do they merely bias subjective reports? In this talk, I will present converging evidence from computational modeling and functional neuroimaging indicating that motivational influences on perception reflect dissociable perceptual and response components. My talk will examine the role of the reward circuitry in biasing perceptual processes, and provide a computational description of how the drive for reward can lead to inaccurate representations of the world. If time permits, I will present recent work investigating motivational biases in the processing of naturalistic audio-visual stimuli.


Related papers

[1] Leong, Y. C., Hughes, B. L., Wang, Y., & Zaki, J. (2018). Neurocomputational mechanisms underlying motivated seeing. bioRxiv, 364836. doi: 10.1101/364836.
[2] Leong, Y. C., Radulescu, A., Daniel, R., DeWoskin, V., & Niv, Y. (2017). Dynamic interaction between reinforcement learning and attention in multidimensional environments. Neuron, 93(2), 451-463. doi: 10.1016/j.neuron.2016.12.040.