Virtual Touch

From Our Neurons to Yours – Wu Tsai Neuro Podcast

Touch is one of the most important human senses. It lets us connect with the world – and each other. Roboticists like Allison Okamura – today’s podcast guest – think we should be building technology that helps us reconnect through the power of touch.

Listen to the full episode below, or subscribe on Apple Podcasts, Spotify, Google Podcasts, Amazon Music or Stitcher.



That brings us to today's guest. Allison Okamura is the Richard W. Weiland Professor in the Department of Mechanical Engineering at Stanford and a deputy director of the Wu Tsai Neurosciences Institute. Her lab – the Collaborative Haptics and Robotics in Medicine (CHARM) Lab – is dedicated to extending and augmenting the amazing human sense of touch through technology.


Episode Credits

This episode was produced by Michael Osborne, with production assistance by Morgan Honaker, and hosted by Nicholas Weiler. Show art by Aimee Garza.

Episode Transcript

Nicholas Weiler:

This is From Our Neurons to Yours, a podcast from the Wu Tsai Neurosciences Institute at Stanford University. On this show, we crisscross scientific disciplines to bring you to the frontiers of brain science. I'm your host, Nicholas Weiler. Here's the sound we created to introduce today's episode.

Perhaps that is the sound of virtual touch. Recently on the show, we had a conversation about the possibility of creating artificial vision with a bionic eye. Today we're going to talk about technology to enhance another sense, one that often goes underappreciated: our sense of touch. We humans actually have one of the most sensitive senses of touch on the planet. Just in the tips of your fingers, there are thousands of tiny sensors, which scientists call mechanoreceptors, that sense texture, vibration, pressure, even pain.

Our sense of touch also lets us track how our bodies are moving in space. In fact, our refined sense of touch may be part of our success as a species. We humans use touch for everything. Building tools, writing, playing music, you name it. And on an emotional level, touch is fundamental to our social lives. Touch lets us connect with each other and the world around us. But of course, we increasingly live in a technological world where we're often separated from the physical connections that are so important to us. Think about having a conversation on Zoom where you can't put your hand on a friend's arm to emphasize a point. Some scientists and engineers now think we should be building technology that reconnects us with the physical world rather than separating us from it. This is a growing area of research in robotics and virtual reality, a field called haptics. And that brings us to today's guest.

Allison Okamura:

I'm Allison Okamura. I'm a professor in the Department of Mechanical Engineering with a courtesy appointment in computer science.

Nicholas Weiler:

So Allison, you're a mechanical engineer and you run a robotics lab here at Stanford. But you're also deputy director of the Wu Tsai Neurosciences Institute. Listeners might wonder, "What's a robotics lab doing in a neuroscience institute?" I wonder if you could address that for us. What is the connection between robotics and neuroscience?

Allison Okamura:

Thanks for asking that. I sometimes joke that I'm not a card-carrying neuroscientist, but that I should play one on TV. I became really interested in neuroscience as it connects to the field of robotics, and in particular human-machine interaction, because robots, and sometimes virtual experiences, ideally have a physical component: a sense of touch is critical for manipulation and physical interaction. So I feel like you can't really work on robotics without being interested in human touch and the neuroscience that surrounds it.

Nicholas Weiler:

So your lab is developing a whole array of amazing technologies surrounding touch and robotics. And at the heart seems to be this idea of how to extend or augment this amazing human sense of touch through technology. Is that right? And is that what you mean by the term haptics?

Allison Okamura:

Absolutely. We picture haptic technology as a way to artificially stimulate the human body so that a person feels like they're touching real objects, or even unreal objects. If we're successful, it's like a haptic Turing test. The Turing test is famous in computer science: how do you make a computer convince you that you're talking to a person instead of a computer? We would like to pass a haptic Turing test, where we convince a user that they're touching a real-world object. If it's a surgeon, maybe touching an organ. But in fact, they're getting this artificial feedback from one of our devices.

Nicholas Weiler:

That's such an interesting point, that your lab is trying to create these illusions of touching a real object. I've played around recently with some commercial VR tech, and I was impressed by just doing something simple like playing ping pong: there's a little vibration in the handheld controller, and just that little something is enough to really immerse you in feeling like you're in a room playing ping pong.

Allison Okamura:

On one hand, a little bit of haptics, just a vibration, can go a long way, especially combined with other senses. But when we're thinking about training to do surgery, or a surgeon teleoperating a robot in order to do surgery, the types of interactions you have with, say, the tissues of a patient, or with artificial tissues in a virtual environment, need much more than simple vibrations. You want to feel forces, you want to feel the directions of those forces, and you want to feel them at different locations on the hand. For that to happen, you need very clever mechanical and electromechanical designs that create devices that stimulate the skin in just the right way.

Nicholas Weiler:

And so how are you going about that? What are some of the things that you're learning about the neuroscience of touch that will help to create these illusions?

Allison Okamura:

We learn about the neuroscience of touch hand in hand with the devices that we create. For example, some of the first haptic devices, not just in my lab but in the whole field, were these bulky desktop robots: you grabbed onto a handle at the end and it would apply forces to you. And then we realized that these are way too expensive and bulky to really be used by everyday people, or even in training environments.

And so we started exploring the idea, as one example, of locally deforming the skin. Instead of having a whole robot arm that pushes on your hand as an external force, we now have wearable devices that can deform the skin locally. The neuroscience of this is that we needed to understand how [inaudible 00:06:50] humans interpret this information. It's now just skin deformation, where it used to be these large-scale forces: understanding which mechanoreceptors in the skin are being stimulated and what the right frequencies are for the device to stimulate them, as well as some of the more cognitive aspects, like how someone interprets a local skin deformation as a large-scale force, and how that gets combined with additional sensory information to feel realistic.
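For a concrete picture of that mapping, here is a minimal Python sketch of rendering a desired fingertip force as a local skin-deformation command. The frequency bands are textbook approximations of mechanoreceptor sensitivities, and the gain and function names are hypothetical illustrations, not the CHARM Lab's actual calibration or API.

```python
import numpy as np

# Approximate frequency sensitivities of two mechanoreceptor classes
# (textbook ballpark figures, not lab measurements):
MERKEL_BAND_HZ = (0.4, 3.0)       # slowly adapting: static pressure, edges
PACINIAN_BAND_HZ = (40.0, 400.0)  # rapidly adapting: vibration, transients

MM_PER_NEWTON = 1.5  # hypothetical gain: skin displacement per unit force

def force_to_deformation(force_n):
    """Map a desired 3D fingertip force (N) to a tactor command:
    a skin displacement (mm) plus a drive frequency (Hz)."""
    displacement_mm = MM_PER_NEWTON * np.asarray(force_n)
    # Toy heuristic: render light, sustained loads in the Merkel band and
    # move into the Pacinian band once the force becomes large.
    if np.linalg.norm(force_n) < 1.0:
        freq_hz = MERKEL_BAND_HZ[1]
    else:
        freq_hz = PACINIAN_BAND_HZ[0]
    return displacement_mm, freq_hz

deformation, freq = force_to_deformation([0.0, 0.2, 1.5])
print(f"tactor displacement (mm): {deformation}, drive at {freq} Hz")
```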

Nicholas Weiler:

I'm reminded of something you told me one of the last times we spoke: there's this new insight that when you're picking up an object, a lot of the feeling of weight is not in your muscles, as you might imagine, but actually in your fingertips.

Allison Okamura:

That's right. If the object is relatively lightweight, you really don't notice much in your muscles and joints. What you notice is the feedback from these very delicate mechanoreceptors embedded in the skin, which can tell you something about the local texture and how much force you're applying, because your skin deforms a little bit as you touch it. And the different deformation patterns can tell you what directions forces are pushing in, and with how much magnitude.

Nicholas Weiler:

So yeah, now every time I pick up a coffee cup, I'm actually paying attention to this and feeling how the tips of my fingers are shifting slightly as I pick it up. So you actually have devices in the lab now where you can move the fingertips in a way that starts to simulate more complex objects?

Allison Okamura:

That's right. Over the years, we've tried to make smaller and smaller devices that can be mounted just on the fingertips, so that you can move your fingers around, have them tracked in space, pick up virtual objects, and then appropriately stimulate the skin locally, so that it feels like you are manipulating real-world objects. And now we have approaches where we are monolithically 3D printing these soft materials into the right structures, so that you can have a very lightweight device that mounts on your fingertip and you put it on like you would put on a thimble.
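As a rough illustration of what such a fingertip device has to compute, here is a toy penalty-based haptic rendering step: track the finger, measure penetration into a virtual surface, and command a proportional contact force. The stiffness value and the flat-surface model are illustrative simplifications, not the lab's actual renderer.

```python
import numpy as np

# Virtual surface stiffness (N/m). Illustrative; real renderers tune this
# to what the hardware can display without going unstable.
STIFFNESS_N_PER_M = 500.0

def render_contact(finger_pos, surface_z=0.0):
    """Penalty-based rendering against a flat virtual surface at z = 0:
    force grows in proportion to how far the finger penetrates it."""
    penetration_m = surface_z - finger_pos[2]
    return STIFFNESS_N_PER_M * max(penetration_m, 0.0)

# One tick of the haptic loop; real systems repeat this ~1000x per second,
# feeding the result to the device as a skin-deformation command.
finger = np.array([0.02, 0.01, -0.003])  # finger 3 mm below the surface
print(f"commanded normal force: {render_contact(finger):.2f} N")
```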

Nicholas Weiler:

Oh wow. So paint me a picture: what are some of the applications you see there? I mentioned virtual reality headsets, and we tend to think of VR as very entertainment-focused. But you're thinking about much more practical applications.

Allison Okamura:

That's right. So the application of this, beyond entertainment, a key one is training. How do you teach someone to do something very efficiently and safely in a virtual environment before they go and do it in the real world? And this can be from something as complicated as a surgery, where you would rather have someone practice in simulation before they practice on a real patient-

Nicholas Weiler:

Definitely.

Allison Okamura:

To more everyday tasks like, "How do I fix my broken copy machine? What do you need to touch and what do you need to pull on? And can we guide someone through that process using haptic feedback so that they can do it instantaneously?"

Nicholas Weiler:

That does start to sound a little bit like, I don't know, the Star Trek Holodeck experience, where you can have this virtual world where you can train. One of the things I was struck by when looking over your website and some of your past research was the idea of not only working in a virtual world, but using these technologies to extend our ability to touch, through tools like robots. Is that also something where this technology has some direct applications?

Allison Okamura:

Absolutely. With surgery, you can use virtual environments or augmented reality environments for training. But now, with the advent of robot-assisted surgical systems, the surgeon becomes separated from the patient. The way these robot-assisted surgical systems work is that the surgeon sits at a console and manipulates what basically amounts to some fancy joysticks. Those joysticks send commands to a patient-side robot that is actually interacting with the patient's tissues. So in this teleoperated system, there is now an electronic barrier between the surgeon and the patient. The goal of haptics in that kind of environment is to provide the clinician with haptic feedback in real time during the surgery. The ideal illusion is that they feel as if their fingers are directly touching the patient and their tissues, rather than this whole thing being mediated by a bunch of metal and current and voltage. In teleoperated robot-assisted surgery, there are going to be many procedures where you need a sense of touch in order to apply the appropriate forces and deformations to the tissue to accomplish the procedure.
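A skeletal sketch of that teleoperation loop, with made-up scaling gains: hand motion flows from the surgeon's console to the patient-side robot, and measured tissue forces are scaled and reflected back to the surgeon's hands. Real systems add kinematics, filtering, and safety logic that this sketch omits.

```python
import numpy as np

MOTION_SCALE = 0.2  # 5 cm at the surgeon's hand -> 1 cm at the tool tip
FORCE_SCALE = 3.0   # amplify delicate tissue forces back to the surgeon

def teleop_step(master_pos_m, tissue_force_n):
    """One cycle of position-forward / force-back teleoperation."""
    follower_command = MOTION_SCALE * np.asarray(master_pos_m)
    surgeon_feedback = FORCE_SCALE * np.asarray(tissue_force_n)
    return follower_command, surgeon_feedback

cmd, fb = teleop_step([0.05, 0.0, 0.01], [0.1, 0.0, -0.05])
print("follower command (m):", cmd)
print("force reflected to surgeon (N):", fb)
```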

Nicholas Weiler:

So sticking with the medical vein, I wanted to ask about another technology your lab has been developing: these wearable devices that help people recovering from stroke, if I understand correctly, relearn how to use body parts that might be paralyzed or numb as a result of the stroke. It's just remarkable to me that devices that work by patterns of vibration can help with rehabilitation after a stroke. How does that work exactly? What is going on there?

Allison Okamura:

This is one of our most exciting projects. Our technology in this space is a vibrating glove that locally stimulates a number of things: not just the skin, but also the muscles and tendons underneath these small vibrating motors. Why this vibration works is not completely understood, and that's the subject of ongoing neuroscience research. But our belief and our understanding so far is that when the brain becomes damaged after a stroke, you have to basically reconnect neural circuits along different paths from what was there originally. This stimulation actually helps create these new pathways, because it allows our stroke participants to feel sensations that they would normally have to move in order to feel. In this case, our device creates those sensations and helps stimulate recovery.
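For illustration, here is a toy stimulation schedule for such a glove, cycling a brief vibration burst around five fingertip motors. The motor layout, the 250 Hz drive frequency, and the timing are invented for this sketch; they are not the parameters of the lab's rehabilitation glove.

```python
FINGERS = ["thumb", "index", "middle", "ring", "pinky"]
DRIVE_HZ = 250   # in the vibration-sensitive (Pacinian) frequency range
BURST_S = 0.1    # duration of each vibration burst

def stimulation_schedule(n_cycles=2):
    """Yield (motor, frequency, duration) commands, one finger at a time."""
    for _ in range(n_cycles):
        for finger in FINGERS:
            yield finger, DRIVE_HZ, BURST_S

for motor, hz, secs in stimulation_schedule():
    print(f"vibrate {motor} motor at {hz} Hz for {secs} s")
```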

Nicholas Weiler:

Interesting. So it's kickstarting the communication between the sensory nerves and muscles and things in the body, and the brain that's trying to relearn essentially how to interpret and control those body parts.

Allison Okamura:

Exactly. After a stroke, a person often has trouble moving. And if you don't move, you don't get that sensory feedback. The thought is that just providing the sensory feedback helps encourage the growth of new neural pathways that help these folks eventually recover their movement-sensation control loop.

Nicholas Weiler:

That's amazing. It reminds me of this idea of using haptics to help train people, say in a medical context or in other contexts. So much depends on the integration of our minds, bodies, and senses, and it seems like a really interesting direction. This may be a little bit of a silly question, but I wanted to ask: I feel like we often have this feeling of disconnection between our minds and our bodies, and people do yoga and other things to try to reconnect. Do you see, philosophically or practically, this being part of the meaning of this research, to find ways of connecting brain and body?

Allison Okamura:

That's a great question. Being able to connect physical feeling and interaction to a mental state, whether that's about wellbeing or about being able to control your movements as desired, I think is a critical part of what makes us human. For me, as someone who develops haptic technology, part of it is also making these devices less our masters and more our collaborators. Can the everyday technology we interact with do that? Right now, it's Zoom calls and cell phones. But as these environments become more immersive and more virtual, we probably don't want that disconnect; the same disconnect we have with our bodies is going to be even worse with a virtual body. So how do we use haptics and other sensory modalities to make these experiences more real, and let us connect with remote loved ones, learn how to do new tasks as we were talking about earlier, and do other things we want to achieve as people?

Nicholas Weiler:

Yeah, and what you said reminds me of another project your lab has been working on: not just feeling through robots or stimulating therapeutically, but also understanding social touch and how we could use wearables to convey the kinds of touch that help people bond with each other. Could you tell us a little about the experiment with what I think you called haptic emojis? What was that, and what did you do there?

Allison Okamura:

Yeah. Well, during the pandemic, I think a lot of us recognized that Zoom calls or phone calls were not enough to fully emotionally connect with others. So we had this project whose aim was to support human-human interaction using artificial touch sensations. We started by putting sensors on people's skin, basically sensor sleeves, and having pairs of people interact with each other while we recorded their touch interactions. We would say to someone, "Tell your friend here that you're happy through the sense of touch," and the first person might touch the other person on the arm and shake it a little bit to convey happiness. Or if it was a different signal, like "Pay attention to me," then the first person might poke the person who was wearing the sensor sleeve. So we recorded thousands and thousands of data points about how people use the sense of touch to communicate social-emotional cues.

Then we took wearable haptic devices that we would similarly place on the arm, and we used that data to recreate these touch sensations. All of this data went into forming the ideal way to convey happiness or sadness or love and attention, the "pay attention to me" emoji. So these wound up being captured as emojis that, rather than being an image you might send through a text message, are a series of touches that convey these pieces of information. The advantage here is, first of all, picking up again some of the interactions that are missing when we're remote, but also creating a different type of connection, one that might be more natural than trying to replace touch with another sense like vision.
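Here is a minimal sketch of the playback side of such a system: each emoji maps to a sequence of touch primitives played on a wearable sleeve. The vocabulary and patterns below are invented placeholders; the lab's actual patterns were derived from the recorded human touch data Okamura describes.

```python
# Each "emoji" is a sequence of (location, intensity 0-1, duration s)
# touch primitives to play on a wearable sleeve's actuators.
HAPTIC_EMOJI = {
    "happy":     [("forearm", 0.6, 0.3), ("forearm", 0.6, 0.3)],  # gentle shakes
    "attention": [("wrist", 1.0, 0.1)],                           # a sharp poke
    "love":      [("upper_arm", 0.3, 1.5)],                       # a slow press
}

def play(emoji):
    """Send each touch primitive to the sleeve (stubbed out as a print)."""
    for location, intensity, duration in HAPTIC_EMOJI.get(emoji, []):
        print(f"actuate {location}: intensity={intensity}, {duration} s")

play("happy")
```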

Nicholas Weiler:

I can just imagine it. We spend so much time looking at screens these days, but having a way to tie all of these things together, and particularly this last one, these social touch emojis, feels like it could bring a lot more humanity to our very digital lives.

Allison Okamura:

Exactly. Though I will say, as someone who develops technology, I always have a bit of a twinge: does this mean it gives me permission to go on more trips away from my family, because I could just interact with them remotely? I always want to be a little careful about that. No matter how good this technology gets, I don't think it'll really replace people being in person with each other. But when it's needed, I think it can provide a great benefit.

Nicholas Weiler:

Yeah, absolutely. To add more in places where we're missing some of that social content, or contact rather. Well, there are so many more things I want to talk to you about. You've got this whole array of soft robots that can expand and contract and fit through halls to fight fires or conduct search and rescue, and lots of other work on wearable devices that we didn't have time to get into. I think we'll just have to come back and talk about those in a future episode. But I'd love to say thank you so much for coming on the show. It was really fascinating to talk about these ideas.

Allison Okamura:

Thanks for having me. Lots of fun.

Nicholas Weiler:

Thanks so much again to our guest, Allison Okamura. For more info about her work, check out the links in the show notes. This episode was produced by Michael Osborne, with production assistance by Morgan Honaker. I'm Nicholas Weiler. See you next time.