Brain-Robot Interfaces
Using brain-robot interfaces for controlling implicit social patterns

Several emerging computer devices read bio-electrical signals (e.g., electro-corticographic signals, skin biopotential, or facial muscle tension) and translate them into computer-understandable input. We investigated how one low-cost, commercially available device could be used to control a domestic robot. First, we used the device to issue direct motion commands; while we could control the robot somewhat, it proved difficult to do so reliably. Second, we interpreted one class of signals as suggestive of emotional stress, and used that as an emotional parameter to influence (but not directly control) robot behaviour. In this case, the robot would react to human stress by staying out of the person's way. Our work suggests that influencing robot behaviour may be a reasonable way to leverage such devices.
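The second approach can be illustrated with a small sketch: a noisy, normalized stress reading influences (rather than directly commands) the robot by widening the distance it keeps from the person. The function name, the 0.0-1.0 stress scale, and the distance values are illustrative assumptions, not details from the project itself.

```python
def avoidance_distance(stress_level, base_distance=0.5, max_extra=1.5):
    """Map a normalized stress reading (0.0 = calm, 1.0 = stressed)
    to a keep-away distance in metres: the more stressed the person
    appears, the wider the berth the robot gives them.

    All numeric values here are hypothetical, chosen for illustration.
    """
    # Clamp the reading, since bio-signal estimates are noisy.
    stress_level = min(max(stress_level, 0.0), 1.0)
    return base_distance + stress_level * max_extra

# A calm person allows a close approach; a stressed person does not.
print(avoidance_distance(0.0))  # 0.5
print(avoidance_distance(1.0))  # 2.0
```

The key design choice mirrors the finding above: the stress signal acts as a continuous parameter shaping behaviour, so momentary misreadings degrade gracefully instead of issuing a wrong discrete command.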

Researchers

Paul Saulnier (MSc)
Ehud Sharlin (Supervisor)
Saul Greenberg (Co-Supervisor)

Key Publications