Virtual reality for behavioral change

Individualized closed-loop sensorized virtual reality for behavioral change

Prof. Yael Hanein & Dr. Tom Schonberg

Yael Hanein is a Professor of Electrical Engineering at Tel Aviv University. In the past she conducted research at the Weizmann Institute (MSc and PhD in Physics), at Princeton University (visiting student in the lab of Nobel Prize Laureate Prof. Dan Tsui), and at the University of Washington (postdoc in Electrical Engineering and Physics). Her research field is neuro-engineering, focusing on the development of wearable electronics and bionic vision.

About The Project:

In the first 8 months of the project we established a proof of concept: high-resolution, wireless surface electromyography (sEMG) recording during VR engagement, combined with eye tracking. To the best of our knowledge, this is the first demonstration of a system capable of recording internal states during a VR task. Figure 1 shows the 16-channel wireless facial sEMG array with the VR headset on a participant's face.

We show, for the first time, the ability to identify different facial expressions with the unique sEMG electrodes while the VR headset is worn. Figure 2 shows differential signals for different facial expressions during a calibration phase: closed smile, open smile, disgust, and anger.
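To make the calibration idea concrete, here is a minimal sketch of how differential (bipolar) sEMG features could be formed from monopolar channels and matched against per-expression calibration templates. The channel pairing, feature choice (windowed RMS), and nearest-centroid matching are illustrative assumptions, not the project's actual pipeline.

```python
import math

# Hypothetical bipolar electrode pairs; the real 16-channel montage may differ.
PAIRS = [(0, 1), (2, 3), (4, 5)]

def differential_rms(channels, pairs=PAIRS):
    """Form differential signals from monopolar channels and reduce each
    to a single RMS amplitude feature."""
    feats = []
    for a, b in pairs:
        diff = [x - y for x, y in zip(channels[a], channels[b])]
        feats.append(math.sqrt(sum(d * d for d in diff) / len(diff)))
    return feats

def classify(feats, templates):
    """Nearest-centroid match against per-expression calibration templates
    (feature vectors recorded during the calibration phase)."""
    def dist(centroid):
        return math.sqrt(sum((f - c) ** 2 for f, c in zip(feats, centroid)))
    return min(templates, key=lambda name: dist(templates[name]))
```

During calibration, one template vector per expression (closed smile, open smile, disgust, anger) would be averaged from labeled repetitions; at test time, a sliding window of recent samples is featurized and matched.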

Over the past 8 months we also developed a dynamic VR task (Figure 3) using the Unity engine. The VR task induces positive internal states (for example, a cute creature; Figure 3A, 3B top) and allows us to compare these responses, in a controlled fashion, to responses to neutral geometric figures (Figure 3A, 3B bottom). While participants explore a house using head movements within the VR task, animals and control objects appear at pre-defined random places.
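In the Unity project this placement logic would live in C#; as a language-agnostic sketch, scheduling stimuli into pre-defined random places might look like the following. The anchor names, the 50/50 category balance, and the trial count are hypothetical, not taken from the actual task.

```python
import random

# Hypothetical pre-defined spawn anchors inside the virtual house.
ANCHORS = ["kitchen_shelf", "hallway_table", "bedroom_corner", "living_room_rug"]

def build_trial_schedule(n_trials, seed=None):
    """Assign each trial a stimulus category (positive animal vs. neutral
    geometric control) and one of the pre-defined anchor positions, then
    randomize the trial order."""
    rng = random.Random(seed)           # seeded for reproducible schedules
    schedule = []
    for i in range(n_trials):
        category = "animal" if i % 2 == 0 else "control"  # balanced categories
        schedule.append((category, rng.choice(ANCHORS)))  # random pre-set place
    rng.shuffle(schedule)
    return schedule
```

Restricting randomness to a fixed set of anchors keeps viewing distance and eccentricity comparable across the animal and control conditions, which is what makes the comparison controlled.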