

Seminar at Speech, Music and Hearing:

Can we teach/control a robot using brain signals?

Luis Montesano, University of Zaragoza


In this talk I will present research on using brain activity from an EEG-based Brain-Machine Interface (BMI) to control and teach a robot. I will discuss two alternative and complementary directions: 1) direct control from kinematic information extracted from EEG measurements; and 2) task-oriented control using error-related potentials as feedback in a reinforcement learning framework. In the first case, control is implemented as a combination of motion intention detection and motion decoding. Motion intention can be extracted through careful treatment of the signal using simple linear classifiers. I will also discuss recent results on decoding 3D arm trajectories from EEG measurements, which are still not robust enough for robot control. The second case is a new paradigm for BMI-controlled devices. Current systems use pre-programmed devices and place much of the burden on the BMI. We propose an alternative approach in which brain activity closes the loop and provides a feedback signal to the system. The feedback is derived from error-related potentials, a type of synchronous activity triggered by unexpected outcomes. We will show first results on teaching a robotic arm to reach a position using rewards based on brain signals.

15:15 - 17:00
Friday November 11, 2011

The seminar is held in Fantum.


Published by: TMH, Speech, Music and Hearing

Last updated: Wednesday, 23-Jun-2010 09:22:46 MEST