REAL: Situated dialogues in instrumented environments

We survey the research project REAL, in which we investigate how a system can proactively assist its user in solving different tasks in an instrumented environment by sensing implicit interaction and utilising distributed presentation media. First, we introduce the architecture of our instrumented environment, which uses a blackboard to coordinate the environment's components, such as the sensing and positioning services and the interaction devices. A ubiquitous user model provides contextual information on the user's characteristics, actions and locations; the user may access and control their profile via a web interface. We then present two mobile applications that employ this environmental support for situated dialogues: a shopping assistant and a pedestrian navigation system. Both applications allow multi-modal interaction through a combination of speech, gesture and sensed actions such as motion.
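The blackboard coordination described above can be pictured as a simple publish/subscribe mechanism: sensing and positioning services post events to a shared blackboard, and interaction components subscribe to the event types they need. The following minimal Python sketch illustrates that pattern under our own assumptions; the names (Blackboard, post, subscribe, the event fields) are hypothetical and do not reflect the project's actual interface.

```python
from collections import defaultdict
from typing import Callable, Dict, List


class Blackboard:
    """Central coordination point for the environment's components (illustrative sketch)."""

    def __init__(self) -> None:
        # Map each event type to the handlers of the components subscribed to it.
        self._subscribers: Dict[str, List[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, event_type: str, handler: Callable[[dict], None]) -> None:
        """Register a component's handler for one event type."""
        self._subscribers[event_type].append(handler)

    def post(self, event_type: str, event: dict) -> None:
        """Deliver an event to every component interested in this type."""
        for handler in self._subscribers[event_type]:
            handler(event)


blackboard = Blackboard()

# The navigation application reacts to sensed positions ...
def on_position(event: dict) -> None:
    print(f"Navigation: user {event['user']} is at {event['location']}")

blackboard.subscribe("position", on_position)

# ... which a positioning service posts as implicit, sensed interaction.
blackboard.post("position", {"user": "alice", "location": "corridor-2"})
```

In this reading, a positioning service posts a location event and the subscribed navigation component reacts to it, which matches the kind of implicitly sensed interaction the abstract describes.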
