
Novel haptic interfaces for studying human perception in virtual environments

Stanford Neurosciences Institute, EPFL, Jacob Suchski
Virtual environments with haptic (force and tactile) feedback provide a unique opportunity to study human perception and sensory mechanisms using controlled stimuli and behavioral measurements. Virtual reality (VR) allows us to change the properties of physical elements in a virtual world and investigate human perception in environments that either mimic the real world or are not physically realizable. Leveraging this ability, we can deepen our understanding of how humans perceive objects in the real world and advance the development of immersive virtual environments for education and training.

In this project, we aim to study how people perceive the center of mass of virtual objects by extending our current haptic devices with a twisting, or “roll,” degree of freedom. By collaborating with world leaders in origami robotics at the Reconfigurable Robotics Lab (RRL) at EPFL (PI Jamie Paik), we believe we can build a novel, lightweight haptic device that enables this line of study.

 

Participants

Lead Researcher(s): Jacob Suchski

Advisor: Allison M. Okamura (Mechanical Engineering)

EPFL exchange host: Jamie Paik, Lab Director - Reconfigurable Robotics Lab, École polytechnique fédérale de Lausanne (EPFL), Institute of Mechanical Engineering, School of Engineering

Funding Type: EPFL-Stanford Exchange

Award Year: 2018