Data-driven Modelling of Interaction Skills for Social Robots

A project that uses robot-mediated human interaction as a means of collecting data for modelling social signals in human-robot interaction.

This project aims to investigate the fundamentals of situated and collaborative multi-party interaction, and to collect the data and knowledge required to build social robots that can handle collaborative attention and co-present interaction. In the project we will employ state-of-the-art motion and gaze tracking on a large scale as the basis for modelling and implementing critical non-verbal behaviours, such as joint attention, mutual gaze and backchannels, in situated human-robot collaborative interaction in a fluent, adaptive and context-sensitive way.


Jonas Beskow (Project leader)
Hedvig Kjellström
Samer Al Moubayed

Funding: SRA/KTH

Duration: 2016 - Ongoing

Related publications:

Published by: TMH, Speech, Music and Hearing

Last updated: 2012-11-09