Medical 'mixed reality' applications take center stage at open house event


By Nicholas Weiler

From teaching the fine art of the spinal tap to gamifying at-home physical therapy for stroke survivors, creative uses of virtual and augmented reality technology in medicine were on display at an open house held in December at the Wu Tsai Neurosciences Institute.

Many of the projects on display were developed by students in the Fall Quarter “Mixed Reality in Medicine” course, hosted by Wu Tsai Neuro’s Visualization Laboratory, which opened in 2021 as a community hub for researchers across campus to explore and develop new applications for these emerging technologies. The course was led by Visualization Lab co-director Christoph Leuze and radiology professor Bruce Daniel.

Leuze became interested in the possibilities of mixed reality technologies for visualizing MRI data during his time as a Wu Tsai Neuro interdisciplinary postdoctoral scholar working with Jennifer McNab, an associate professor of radiology who is now co-director of the Visualization Lab.

Wu Tsai Neuro Visualization Lab co-director Christoph Leuze (Image credit: Nicholas Weiler)

“Mixed reality technologies are maturing to the point where they can produce really useful applications, and medicine is one of the first fields that’s going to benefit from that,” Leuze said. “We’ve seen huge demand from researchers and clinicians to get involved in mixed reality applications, but there has been a barrier because not many people have the technical know-how to get started. This course fills that gap by not only teaching students about potential applications of mixed reality technologies, but also how to design and implement projects relevant to their own research interests.”

The open house was held following a panel discussion on mixed reality for medical training, hosted by Stanford Medical Mixed Reality (SMMR), a group that Leuze leads as executive director.

Learn more about the Visualization Lab at the Wu Tsai Neurosciences Institute


“The motivation behind SMMR and the Wu Tsai Neuro Visualization Lab is to bring together all the groups at Stanford working on medical mixed reality and to invite the broader Stanford community to explore what’s possible with these technologies,” Leuze said. “Mixed reality is not just the metaverse. There are useful, practical applications of VR and AR technology that have the potential to advance many people’s research. This is one of the most fun technologies out there right now, so we want to encourage people to just come by and try it out.”

Take a tour of the SMMR Open House or check out a few of the student projects on display below:



Image credit: Nicholas Weiler

Mechanical engineering PhD student Jasmin Palmer, a member of Allison Okamura’s Collaborative Haptics and Robotics in Medicine (CHARM) Lab, combined an augmented reality headset with haptic feedback technology to develop a virtual lumbar puncture (spinal tap) training program. Users could not only see themselves performing the procedure, but also feel the changing physical pressure that guides experienced practitioners to get the needle in just the right spot.

“It’s a very haptic procedure, involving the physician palpating the spine with their hands and then feeling the pop to know they’ve gotten the needle through to the spinal cord properly,” Palmer said. “It seemed like a good procedure to try to simulate, especially because the radiologist I consulted with said there’s a real shortage of clinical dummies to help medical trainees learn the procedure.”

Watch Palmer discuss her work



Image credit: Nicholas Weiler

Laura Schütz, a master’s student in the Design Impact program in the School of Engineering, is also working to make augmented reality more immersive, in her case by incorporating the exquisite human sense of hearing.

One of the signature projects being developed in the Visualization Lab is an augmented reality system to help researchers precisely and accurately target noninvasive transcranial magnetic stimulation (TMS) in research subjects. The work is being done in the Koret Human Neurosciences Community Laboratory next door in the Stanford Neurosciences Building. The application lets a researcher wearing AR goggles see MRI imagery of a subject’s brain overlaid on the subject’s head, helping them target a specific brain region of interest with the TMS “wand.”

Schütz is collaborating with Leuze to take this work a step further by incorporating sound — letting researchers hear as well as see when they have reached the right spot. “We often rely too heavily on our visual systems, but I’m interested in exploring ways we can reduce the cognitive demands of complex clinical procedures by moving some of the load to other sensory systems,” Schütz said.



Image credit: Nicholas Weiler

Godson Osele, another mechanical engineering PhD student in the CHARM Lab, designed a system to let people with mobility disabilities control their computer cursor with their eyes using a mixed reality headset. Osele, who has a background in biomedical engineering, was inspired by shadowing a Stanford radiologist with movement disabilities and seeing how challenging routine tasks like typing, using a mouse, and navigating medical software could become for someone with limited mobility over a long clinical workday.

Osele’s program uses the mixed reality headset’s built-in eye-tracking capabilities to let users sit back and control their computers without requiring extensive physical mobility, and he hopes to incorporate additional features, such as basic voice commands and blink-based signals, to enhance its functionality.

“Often solutions for people with disabilities require designing very custom systems, but this, you just have to put on the headset and set the eye calibration once, and you’re good to go,” Osele said. “This felt like something that could help a huge amount of people with very little infrastructure.”

Watch Osele discuss his work



Image credit: Nicholas Weiler

Computer science major Rachel Naidich (’23) became interested in physical therapy through personal experiences as an athlete. As she learned more about the subject, she discovered that the biggest obstacle people face in regaining physical function is often not the therapy itself, but adhering faithfully to the regimen over the necessary weeks or months.

In part, Naidich said, the therapies are just boring, like having to close and open your hand a hundred times. So she developed an augmented reality demo in which users play a turtle-smashing game, flicking away virtual turtles with a finger movement that mimics a key physical therapy exercise.

“If a physical therapist asks you to close and extend your hand over and over for 30 minutes or an hour every day, it becomes really boring, and people are going to stop doing it after a while,” Naidich said. “But I was hoping to create a way to make it fun and hopefully also provide feedback about how you’re doing to help guide the therapy.”

Watch Naidich discuss her work