Architecture of a Framework for Generic Assisting Conversational Agents

In this paper, we focus on the notion of Assisting Conversational Agents (ACAs): embodied agents dedicated to assisting novice users of software components and/or web services. We discuss the main requirements of such agents and emphasize the genericity issue that arises in the dialogical part of their architectures. This leads us to propose a mediator-based framework that relies on a dynamic symbolic representation of the runtime state of the assisted components. We then define three strategies for developing the mediators, validated by experiments implemented in various situations.
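As a rough illustration of the mediator idea sketched above, the following minimal Python fragment shows one possible shape of such a component: a mediator that wraps an assisted application and exposes a symbolic snapshot of its runtime state for the agent's dialogue manager to query. All names here (Mediator, SymbolicState, describe_state, the "mail_client" example) are illustrative assumptions, not taken from the paper.

```python
# Hypothetical sketch: a mediator translating runtime events of an assisted
# component into a symbolic state that an Assisting Conversational Agent can query.
from dataclasses import dataclass, field
from typing import Any, Dict


@dataclass
class SymbolicState:
    """Symbolic description of the assisted component's current runtime state."""
    facts: Dict[str, Any] = field(default_factory=dict)

    def update(self, key: str, value: Any) -> None:
        self.facts[key] = value


class Mediator:
    """Bridges the assisting agent and one assisted component (illustrative only)."""

    def __init__(self, component_name: str):
        self.component_name = component_name
        self.state = SymbolicState()

    def on_component_event(self, event: str, payload: Any) -> None:
        # Translate a runtime event of the assisted component into symbolic facts.
        self.state.update(event, payload)

    def describe_state(self) -> Dict[str, Any]:
        # Queried by the dialogue manager when generating assistance utterances.
        return dict(self.state.facts)


if __name__ == "__main__":
    mediator = Mediator("mail_client")
    mediator.on_component_event("open_dialog", {"name": "compose_message"})
    print(mediator.describe_state())
```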
