Script Language for Embodied Agents as Personal Conversational Media in Online Communities

In this paper, we propose a script language for embodied conversational agents (ECAs) that function as personal conversational media in asynchronous community systems. Although community interaction is inherently a social event, current online community systems focus mainly on information exchange through text and provide little support for establishing and maintaining social relationships among participants. To draw on human skill in social interaction, we have developed an asynchronous community system that employs ECAs as conversational media. An ECA, an animated character on a screen, can display various social expressions on behalf of its user; a group of ECAs acting together on a screen can therefore construct and represent a socially interactive environment. This environment, in turn, induces social and psychological relationships between each ECA and the users. We call such ECAs Personified Media (PM). We propose a script language for PM, PMScript, which enables users to specify both the expressive and interactive behaviors of their PM, together with the handling of other media content. Because the community is asynchronous, participants have sufficient time to compose script descriptions for their PM. Beyond the general features of ECAs, the social presence of a PM can enhance users' community awareness of the human social environment. Conversations mediated by PM should therefore be smooth, expressive, informative, and social. Finally, by accumulating submitted scripts, PMScript can also serve as material for further analysis and processing of actual participant behavior in community interactions.
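The abstract does not reproduce PMScript's actual syntax. Purely as a hypothetical illustration of what a markup-based agent script of this kind might look like (in the spirit of related languages such as MPML and VHML; every element and attribute name below is an assumption, not the paper's notation), a PM posting that combines an utterance, a nonverbal expression, and a reference to other media content could be sketched as:

```xml
<!-- Hypothetical sketch only: tag names are illustrative assumptions,
     not the PMScript syntax defined in the paper. -->
<script agent="alice-pm">
  <!-- an expressive behavior performed by the user's PM -->
  <express emotion="smile" intensity="medium"/>

  <!-- spoken/written utterance delivered on the user's behalf -->
  <say>Thanks for the photos from the workshop!</say>

  <!-- handling of other media content alongside the utterance -->
  <show media="image" src="workshop.jpg"/>

  <!-- an interactive behavior directed at another participant's PM -->
  <turnTo target="bob-pm"/>
  <say>Bob, could you post the slides as well?</say>
</script>
```

A sketch like this also suggests why accumulated scripts are useful as analysis material: each submission is a structured record of who addressed whom, with what expression, and with which media attached.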
