Personalized spatially-aware dialogue systems
A project exploring the intersection between spoken dialogue systems, geographic databases and crowd-sourcing.
The goal of this research is to develop methods for robust spoken natural-language interaction between humans and machines in a
rapidly changing spatial context. We consider the scenario of a spoken dialogue system helping pedestrians in a city: people
who need information about directions or services in the immediate vicinity, and who carry a mobile phone with a
GPS receiver. The system, being aware of the user's location, pushes information and answers questions in an ongoing
natural-language dialogue with the user, taking the spatial context into consideration. For instance, the user might need directions ("I
want to go to Odenplan" or "I need to find a pharmacy"), and the system computes a route from the user's current location to the
desired goal, and then gives directions as the user walks along. A key challenge in this research is to devise robust and personalized
methods to generate and understand utterances involving geographical and spatial references, based on contextual factors such as
the user's current speed, direction, personal preferences, the preceding dialogue, and the geographic vicinity. Another key problem
in the proposed project is how to let users add geographic information to the database using speech, resulting in better
crowd-sourced open geographic databases and, in turn, better pedestrian routing systems.
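The route computation mentioned above can be sketched as a shortest-path search over a pedestrian street graph, with great-circle distances as edge weights and compass bearings available for phrasing directions. This is an illustrative sketch only: the node names, coordinates, and graph here are invented for the example and are not the project's actual data or implementation.

```python
import heapq
import math

def haversine_m(a, b):
    """Great-circle distance in metres between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371000 * math.asin(math.sqrt(h))

def bearing_deg(a, b):
    """Initial compass bearing from a to b, in degrees (0 = north)."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    dlon = lon2 - lon1
    x = math.sin(dlon) * math.cos(lat2)
    y = math.cos(lat1) * math.sin(lat2) - math.sin(lat1) * math.cos(lat2) * math.cos(dlon)
    return (math.degrees(math.atan2(x, y)) + 360) % 360

def shortest_route(graph, coords, start, goal):
    """Dijkstra over an adjacency-list street graph; returns the node path."""
    dist = {start: 0.0}
    prev = {}
    queue = [(0.0, start)]
    while queue:
        d, node = heapq.heappop(queue)
        if node == goal:
            break
        if d > dist.get(node, float("inf")):
            continue  # stale queue entry
        for nbr in graph.get(node, ()):
            nd = d + haversine_m(coords[node], coords[nbr])
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                prev[nbr] = node
                heapq.heappush(queue, (nd, nbr))
    # Walk predecessors back from the goal to reconstruct the route.
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return path[::-1]

# Hypothetical street corners near Odenplan, Stockholm.
coords = {"A": (59.3428, 18.0496), "B": (59.3440, 18.0540), "C": (59.3460, 18.0600)}
graph = {"A": ["B"], "B": ["A", "C"], "C": ["B"]}
route = shortest_route(graph, coords, "A", "C")
```

In a full system, each leg of the resulting node sequence would then be verbalized (using the bearing relative to the user's current heading, e.g. "turn slightly left") and updated as fresh GPS fixes arrive.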
Johan Boye (Project leader)
Funding: VR (2013-4854)
Duration: 2014 - Ongoing