Event Details:
![MBCT | Mind, Brain, Computation, and Technology | Seminar Series 2024-2025 banner. Cynthia Moss smiles while holding a small brown bat perched on a yellow-gloved hand; the dimly lit background glows purple and red, highlighting the bat and her expression of interest.](/sites/default/files/styles/scale_and_crop_75_68_sm_x2/public/2025-01/Cynthia%20Moss.png?itok=LBgd_csL)
Continue the conversation: Join the speaker for a complimentary dinner in the Theory Center (second floor of the neurosciences building) after the seminar
Neural codes for 3D auditory localization
Abstract
As humans and other animals move through the natural environment, the distance and direction to objects change, invoking the need to rapidly encode dynamic information about 3D natural scenes. Animals that rely on active sensing provide powerful research models to investigate the neural underpinnings of 3D scene representation, as they produce the very signals that yield sensory input, which then guide motor actions. Echolocating bats, for example, compute the direction of objects from differences in echo intensity, spectrum, and timing at the two ears, and they estimate their distance to objects from the time delay between sonar calls and echo returns. Further, the bat adapts its echolocation behavior in response to 3D spatial information computed from echo returns, and therefore, the directional aim and temporal patterning of the bat’s calls provide a window to its attention to objects in the environment. This talk will summarize three new findings on neural mechanisms of 3D auditory localization in the big brown bat, Eptesicus fuscus. 1) Echo-delay tuned neurons in the midbrain of the freely flying bat show 3D spatial tuning to echoes from physical objects, and sonar-guided attention evokes sharper echo delay tuning and shifts to shorter echo delays. 2) Local field potential recordings from auditory midbrain neurons in the passively listening bat encode the time interval between call-echo pairs in the microsecond range, which aligns with high resolution behavioral performance data. 3) A population of hippocampal CA1 neurons encodes the distance of auditory objects, implicating this brain structure in time-space computations. Collectively, these findings serve to bridge results from behavioral and neurophysiological studies and demonstrate that 3D auditory spatial coding operates through networks of neurons across midbrain and telencephalic brain regions.
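To make the call-echo delay computation described in the abstract concrete, the short sketch below (an illustrative example, not material from the talk) converts a round-trip echo delay into target range using a nominal speed of sound of 343 m/s. It also shows that a one-microsecond change in delay corresponds to a sub-millimeter change in range, the scale of sensitivity the abstract attributes to the bat's auditory midbrain.

```python
# Illustrative sketch (not from the talk): converting a sonar call-to-echo delay
# into target range. Assumes a nominal speed of sound in air of 343 m/s (~20 C);
# the bat's actual computation is neural, not explicit arithmetic.

SPEED_OF_SOUND_M_PER_S = 343.0  # approximate speed of sound in air

def echo_delay_to_range(delay_s: float) -> float:
    """Return the one-way distance to a target given the round-trip echo delay."""
    return SPEED_OF_SOUND_M_PER_S * delay_s / 2.0

if __name__ == "__main__":
    # A 5 ms call-to-echo delay corresponds to a target roughly 0.86 m away.
    print(f"5 ms delay  -> {echo_delay_to_range(5e-3):.3f} m")
    # A 1 microsecond change in delay shifts the estimated range by about 0.17 mm,
    # consistent with the microsecond-scale sensitivity noted in the abstract.
    print(f"1 us change -> {echo_delay_to_range(1e-6) * 1000:.3f} mm")
```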
Cynthia Moss
Johns Hopkins University
Cynthia F. Moss is Professor of Psychological and Brain Sciences, with joint appointments in Neuroscience and Mechanical Engineering. At Johns Hopkins, she directs the Comparative Neural Systems and Behavior Laboratory, also known as the Bat Lab. Moss received a B.S. (summa cum laude) from the University of Massachusetts, Amherst, and a Ph.D. from Brown University. She was a Postdoctoral Fellow at the University of Tübingen, Germany, and a Research Fellow at Brown University before joining the faculty at Harvard University. At Harvard, Moss received the Phi Beta Kappa teaching award and was named the Morris Kahn Associate Professor; she also received the National Science Foundation Young Investigator Award. She later moved to the University of Maryland, where she was a Professor in the Department of Psychology and the Institute for Systems Research and served as Director of the interdepartmental graduate program in Neuroscience and Cognitive Science. In 2010, she was recognized with the University of Maryland Regents Faculty Award for Research and Creativity. In 2014, Moss joined the faculty at Johns Hopkins University, where she served as Director of the Behavioral Biology Program (2015-2018) and Chair of the Department of Psychological and Brain Sciences (2020-2023). She was a visiting scholar at Hong Kong University of Science and Technology in 2018-2019 and a Phi Beta Kappa Visiting Scholar in 2023-2024. Her recent awards include the Hartmann Award in Auditory Neuroscience (2017), the James McKeen Cattell Award (2018), and the Alexander von Humboldt Research Prize (2019). She is a Fellow of the American Association for the Advancement of Science, the Acoustical Society of America, and the International Society for Neuroethology.
Hosted by Yiqi Jiang (see profile below)
About the Mind, Brain, Computation, and Technology (MBCT) Seminar Series
The Stanford Center for Mind, Brain, Computation and Technology (MBCT) seminar series explores ways in which computational and technical approaches are being used to advance the frontiers of neuroscience.
The series features speakers from other institutions, Stanford faculty, and senior training program trainees. Seminars occur about every other week and are held at 4:00 pm on Mondays in the Cynthia Fry Gunn Rotunda (Stanford Neurosciences E-241).
Questions? Contact neuroscience@stanford.edu
Sign up to hear about all our upcoming events