01 January 2015

It is possible to interface commercial off-the-shelf (COTS) EEG equipment with 3D virtual environments. In a student project I am supervising, we use the Emotiv Epoc EEG headset and interface it with the 3D game engine Unity. Now that a live communication channel is established over UDP, virtually unlimited experiments can be designed inside Unity. The EEG signals can be used to control objects in a 3D world (e.g., move a character), to affect the screenplay based on emotion (e.g., when a user becomes bored, as determined from the EEG readings, give them a scare!), and so on. Both the EEG and the 3D virtual environment may be further linked to real physical devices, such as a robot arm. Rehabilitation and training of both healthy participants and patients are possible. Suitable experiments can also be designed to aid in reverse-engineering the human brain and working out how it handles tasks such as sensing, planning, and control.
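To make the UDP link concrete, here is a minimal Python sketch of the sending side, not the project's actual code: the port number, the JSON message format, and the simulated band-power values are placeholders. In practice the values would come from the Emotiv SDK, and a Unity-side C# script listening on the same port would parse each packet and, for example, move a character or trigger an in-game event.

```python
# Minimal sketch: stream EEG-derived values to Unity over UDP.
# Port, message format, and values are assumptions for illustration only.
import json
import random
import socket
import time

UNITY_HOST = "127.0.0.1"   # Unity running on the same machine
UNITY_PORT = 5005          # hypothetical port the Unity listener binds to

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

try:
    while True:
        # Placeholder for real EEG features (e.g., alpha/beta band power
        # or the headset's built-in engagement metric).
        sample = {
            "alpha": random.uniform(0.0, 1.0),
            "beta": random.uniform(0.0, 1.0),
        }
        # Simple illustrative rule: low engagement (high alpha, low beta)
        # could trigger an in-game event such as the "scare" mentioned above.
        sample["bored"] = sample["alpha"] > 0.7 and sample["beta"] < 0.3

        sock.sendto(json.dumps(sample).encode("utf-8"),
                    (UNITY_HOST, UNITY_PORT))
        time.sleep(0.1)    # roughly 10 packets per second
except KeyboardInterrupt:
    sock.close()
```

On the Unity side, a script would read the same port with a UDP client, deserialize the JSON, and map the fields onto object transforms or gameplay events; the packet rate and message schema above are just one convenient choice.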

What do you think should be investigated?

http://www.emotiv.com

http://unity3d.com
