Multimodal and mobile conversational Health and Fitness Companions

Multimodal conversational spoken dialogue with physical and virtual agents provides a potential interface for motivating and supporting users in the domain of health and fitness. This paper describes how such multimodal conversational Companions can be implemented to support their owners in various pervasive and mobile settings. We present concrete system architectures, virtual, physical and mobile multimodal interfaces, and interaction management techniques for such Companions. In particular, we show how knowledge representation and the separation of low-level interaction modelling from high-level reasoning at the domain level make it possible to implement distributed, yet coherent, interaction with Companions. The distribution is enabled by using a dialogue plan to communicate information from the domain-level planner to the dialogue manager, and from there to a separate mobile interface. The model enables each part of the system to handle the same information from its own perspective without overlapping logic, and makes it possible to separate task-specific and conversational dialogue management from each other. In addition to the technical descriptions, results from the first evaluations of the Companion interfaces are presented.
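
To illustrate the kind of separation the abstract describes, the following is a minimal, hypothetical sketch (not taken from the paper) of a dialogue plan produced by a domain-level planner, consumed by a dialogue manager, and rendered by a separate mobile interface. All class and method names here are illustrative assumptions, not the system's actual API.

```python
# Hypothetical sketch: a shared dialogue plan passed between a high-level
# domain planner, a dialogue manager, and a mobile interface. Each component
# handles the same plan from its own perspective, without duplicating the
# others' logic. Names and structure are assumptions for illustration only.
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class PlanStep:
    """One task-level step the Companion should accomplish in dialogue."""
    goal: str                                   # e.g. "suggest_exercise"
    slots: dict = field(default_factory=dict)   # task parameters for this step
    completed: bool = False


@dataclass
class DialoguePlan:
    """Representation communicated from the domain planner onwards."""
    steps: List[PlanStep]

    def next_step(self) -> Optional[PlanStep]:
        return next((s for s in self.steps if not s.completed), None)


class DomainPlanner:
    """High-level domain reasoning: decides *what* to talk about."""
    def plan_day(self, user_profile: dict) -> DialoguePlan:
        return DialoguePlan([
            PlanStep("greet_user"),
            PlanStep("review_activity", {"date": "today"}),
            PlanStep("suggest_exercise",
                     {"intensity": user_profile.get("level", "light")}),
        ])


class MobileInterface:
    """A separate (e.g. mobile) front end rendering the same plan step."""
    def present(self, step: PlanStep) -> None:
        print(f"[mobile UI] goal '{step.goal}' with slots {step.slots}")


class DialogueManager:
    """Low-level interaction modelling: turns plan steps into conversational moves."""
    def __init__(self, interface: MobileInterface):
        self.interface = interface

    def run(self, plan: DialoguePlan) -> None:
        step = plan.next_step()
        while step is not None:
            self.interface.present(step)   # interface-specific rendering of the same step
            step.completed = True
            step = plan.next_step()


if __name__ == "__main__":
    plan = DomainPlanner().plan_day({"level": "moderate"})
    DialogueManager(MobileInterface()).run(plan)
```

In such a design, the dialogue plan is the only shared artifact: the planner never needs to know how a step is verbalised or displayed, and the dialogue manager and mobile interface never need to re-derive the task logic, which is one way to realise the distributed but coherent interaction described above.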
