Embodied Approaches to Interpersonal Coordination: Infants, Adults, Robots, and Agents

Rick Dale (rdale@ucmerced.edu)
Cognitive and Information Sciences, University of California, Merced, Merced, CA 95343 USA

Chen Yu (chenyu@indiana.edu)
Department of Psychological and Brain Sciences, Indiana University, Bloomington, IN 47401 USA

Yukie Nagai (yukie@ams.eng.osaka-u.ac.jp)
Department of Adaptive Machine Systems, Osaka University, 2-1 Yamada-oka, Suita, Osaka 565-0871 Japan

Moreno Coco (mcoco@staffmail.ed.ac.uk)
Institute of Language, Cognition, and Computation, University of Edinburgh, 10 Crichton Street, Edinburgh EH8 9AB UK

Stefan Kopp (skopp@techfak.uni-bielefeld.de)
Sociable Agents Group, Cognitive Interaction Technology (CITEC), Technische Fakultät, Universität Bielefeld, Morgenbreede 39, 33615 Bielefeld, Germany

Keywords: human interaction; language learning; human-agent interaction; dynamics; robotics.

Workshop Background and Relevance

Humans interact with other humans frequently, in a wide variety of circumstances, and to accomplish many different goals. Such interpersonal interaction, especially in face-to-face circumstances, requires coordination (Clark, 1996): many subtle behaviors, from eye movements and gestures to choice of words, must be controlled carefully in the context of another person. The characteristics of the cognitive system that give rise to this coordination have recently been a matter of debate in the cognitive sciences, and many questions about how the cognitive system functions in human interaction remain open. How does interpersonal coordination emerge in the dyad? Which behaviors are coordinated between persons, and in what manner? How can we model dyads and their interactions?
One challenge in advancing our understanding of how people use social-cognitive cues in everyday communication is that the empirical evidence is largely based on macro-level behaviors in constrained, unnatural contexts and tasks. To truly understand the mechanisms of interpersonal coordination, we may need to focus on more micro-level behaviors as they unfold in real time and in free-flowing interaction: for example, changes in eye gaze and shifts in body position as they are linked to the objects, events, and actions of a social partner. Several new directions have pursued this microstructure of interpersonal interaction. First, with advances in sensing and computing techniques, we now have the capability to process visual, audio, and other sensory data collected from real-world interactions. This data-intensive approach creates a unique opportunity for new discoveries through advanced data analysis, including visualization techniques for mining the temporal relationships between the behaviors of two people (Yu et al., 2009; see Fig. 1). Such methods have shed light on the timing of interpersonal interaction and on how two individuals adapt to each other, both in infant-adult dyads (e.g., Smith et al., 2010; Yu & Smith, 2012; Nagai et al., 2012) and in pairs of adults (Coco et al., 2012; Richardson & Dale, 2005).

Figure 1: Visualization software for extracting, aligning, and mining large multivariate time series of behaviors to uncover coordination (adapted from Yu et al., 2009).

Second, researchers in developmental robotics have investigated mechanisms of interpersonal coordination in order to model and implement social systems. In this area, recent progress has been achieved in developing robots that elicit human scaffolding (Nagai, Nakatani, & Asada, 2010). This progress has been made possible by implementing underlying processes that could be involved in the dynamic control of interpersonal coordination.
For example, implementing a model of the mirror neuron system can give robots basic skills such as self-other discrimination, and can support more complex abilities such as imitation (Nagai et al., 2011; see Fig. 2, left). By grounding high-level theories in robotic systems, we can address different aspects of how social-cognitive capabilities, such as gaze following and face preference, can be learned through sensorimotor interaction. Third, research on virtual agents has developed new models of embodied human-agent interaction, offering new ways to explore processes of interpersonal coordination. This work has included, for example, the role of gesture and nonverbal behavior (Sadeghipour & Kopp, 2011), attentive speaking (Buschmeier & Kopp, 2011), and feedback (Kopp et al., 2008). Virtual embodied agents provide a foundation for testing theories of adult-adult interaction and for developing social tools that support interpersonal coordination (see Fig. 2, right).
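Analyses of the kind described above often quantify coordination by asking how strongly one partner's behavioral stream matches the other's across a range of time lags (as in the eye-movement coupling analyses of Richardson & Dale, 2005). The following is a minimal illustrative sketch, not any of the cited systems: it uses synthetic gaze-category sequences, and the function name `lag_match_profile` is invented for this example.

```python
import numpy as np

def lag_match_profile(a, b, max_lag):
    """Proportion of time steps at which two categorical series match,
    for each lag of series a relative to series b (illustrative sketch)."""
    a, b = np.asarray(a), np.asarray(b)
    profile = {}
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            x, y = a[lag:], b[:len(b) - lag]   # compare a[t+lag] with b[t]
        else:
            x, y = a[:len(a) + lag], b[-lag:]  # compare a[t] with b[t-lag]
        profile[lag] = float(np.mean(x == y))
    return profile

# Synthetic example: the "listener" looks at the same object the
# "speaker" fixated two time steps earlier (6 possible objects).
rng = np.random.default_rng(0)
speaker = rng.integers(0, 6, size=500)
listener = np.roll(speaker, 2)

profile = lag_match_profile(speaker, listener, max_lag=5)
best_lag = max(profile, key=profile.get)
print(best_lag)  # -> -2: the listener trails the speaker by two steps
```

A peak in the lag profile away from zero, as here, is the kind of asymmetric leading/following signature that such analyses look for in real gaze and body-movement data.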

References

[1] Sadeghipour, A., & Kopp, S. (2011). Embodied gesture processing: Motor-based perception-action integration in social artificial agents. Cognitive Computation.

[2] Yu, C., & Smith, L. B. (2012). Embodied attention and word learning by toddlers. Cognition.

[3] Nagai, Y., et al. (2011). Emergence of mirror neuron system: Immature vision leads to self-other correspondence. In IEEE International Conference on Development and Learning (ICDL).

[4] Buschmeier, H., & Kopp, S. (2011). Towards conversational agents that attend to and adapt to communicative user feedback. In Intelligent Virtual Agents (IVA).

[5] Clark, H. H. (1996). Using Language. Cambridge University Press.

[6] Yu, C., et al. (2009). Visual data mining of multimedia data for social and behavioral studies. Information Visualization.

[7] Richardson, D. C., & Dale, R. (2005). Looking to understand: The coupling between speakers' and listeners' eye movements and its relationship to discourse comprehension. Cognitive Science.

[8] Kopp, S., et al. (2008). Modeling embodied feedback with virtual humans. In Modeling Communication with Robots and Virtual Humans (ZiF Workshop).

[9] Coco, M. I., et al. (2012). Cognitive dynamics of alignment in dialogue games.

[10] Smith, L. B., et al. (2011). Not your mother's view: The dynamics of toddler visual experience. Developmental Science.

[11] Nagai, Y., Nakatani, A., & Asada, M. (2010). How a robot's attention shapes the way people teach. In International Conference on Epigenetic Robotics (EpiRob).

[12] Nagai, Y., et al. (2012). Co-development of information transfer within and between infant and caregiver. In IEEE International Conference on Development and Learning and Epigenetic Robotics (ICDL).