Wu Tsai Neuro's weekly seminar series is being held virtually during the spring quarter. We hope to be able to bring the community together for in-person seminars again in the fall.
Community members interested in meeting with this week's speaker should contact host Mari Sosa of the Giocomo Lab.
Co-hosted by Stanford Department of Psychiatry and Behavioral Sciences.
Assistant Professor of Psychiatry, Neurosurgery, Psychology and Bioengineering
University of California, Los Angeles
Nanthia Suthana, Ph.D., is an Assistant Professor of Psychiatry, Neurosurgery, Psychology and Bioengineering at UCLA. She completed a B.S. and Ph.D. in Neuroscience as well as postdoctoral training at UCLA before joining the faculty. She uses wearable technologies in patients with deep brain electrodes to understand cognitive functions such as learning and memory, and to develop therapies for patients with brain disorders. For her work, she has been awarded several NIH grants, a McKnight Technological Innovations in Neuroscience Award, and a Keck Junior Faculty Award. She currently holds the Ruth and Raymond Stotter Chair in Neurosurgery, serves as Associate Director of the Neuromodulation Division at the Semel Institute for Neuroscience, and co-directs the UCLA Training Program in Translational Neurotechnology. She has also received a UCLA postdoctoral mentoring award and, as Associate Director of Neuroscience Outreach for the Brain Research Institute at UCLA, contributes to efforts to expand diversity and educational and career opportunities for women and underrepresented minorities.
Intracranial neurophysiological representations of space and memory in freely-moving humans
Little is known about the underlying mechanisms in the human brain that allow a person to keep track of their own location while freely moving, or of the locations of others in a shared environment. I will present findings from our research platform, which enables wireless recording of deep brain activity in human participants who can move freely while immersed in real or virtual spatial environments. We find that specific patterns of human medial temporal lobe oscillatory activity are modulated by eye movements, position, walking direction, and the position and walking direction of another person in a spatial environment; some of these patterns further depend on the momentary task goal and/or memory. Lastly, I will present updates on our ongoing aim to use mobile deep brain recording in multiple interacting participants navigating augmented or real-world environments.