Story-telling and emotion: cognitive technology considerations in networking temporally and affectively grounded minds

As the technological agents we interact with take on increasing autonomy and responsibility, acting on our behalf, there are prospects for social changes in our notions of responsibility for our own actions, as well as for the development of 'cognitive calluses' in our interactions with such technologies. Other agents assist us in tasks such as search, scheduling, and organizing the information we access and other parts of our lives, so appropriate design considerations are desirable in order to respect human wholeness and build humanely interfaced social agents (Dautenhahn & Nehaniv 1999). The impact of technology on human modes of existence, experience, and relationships is changing who we are, and how we interact and relate with one another, along a broad collection of dimensions that should be considered by the designers and users of technology (Gorayska et al. 1997). Yet most software agents today have no consciously designed affective communication skills or, if they do, often display inappropriate affect to the user and are unable to support the high 'affective bandwidth' present in human face-to-face (but not e-mail) communication (e.g. Picard 1997). Moreover, while human cognition may be fundamentally structured to deal with temporal grounding in terms of stories and narrativity (Schank & Abelson 1977, 1995, Schank 1990, Read & Miller 1995), software agents today tend to lack any semblance of temporal grounding: they merely react to the user on the basis of little or no information about what has happened in the past (Dautenhahn & Nehaniv 1998), or behave in a manner (e.g. strangely discontinuous) that is not believable to our human narrative intelligence (Sengers 1998, Sengers 1999).

[1] Chrystopher L. Nehaniv, Algebraic models for understanding: coordinate systems and cognitive empowerment, 1997, Proceedings Second International Conference on Cognitive Technology: Humanizing the Information Age.

[2] Henry Lieberman, et al., Instructible Agents: Software that Just Keeps Getting Better, 1996, IBM Syst. J.

[3] Chrystopher L. Nehaniv, The First, Second and Third Person Emotions: Grounding Adaptation in a Biological and Social World, 1998.

[4] Chrystopher L. Nehaniv, Meaning for observers and agents, 1999, Proceedings of the 1999 IEEE International Symposium on Intelligent Control / Intelligent Systems and Semiotics (Cat. No. 99CH37014).

[5] Kerstin Dautenhahn, et al., Human Cognition and Social Agent Technology, 2000.

[6] M. Posner, The Brain and Emotion, 1999, Nature Medicine.

[7] R. Byrne, et al., Machiavellian intelligence: social expertise and the evolution of intellect in monkeys, apes, and humans, 1990.

[8] Henry Lieberman, et al., Watch what I do: programming by demonstration, 1993.

[9] R. Schank, et al., Knowledge and Memory: The Real Story, 1995.

[10] J. Bruner, The Narrative Construction of Reality, 1991, Critical Inquiry.

[11] Kerstin Dautenhahn, et al., Constructive biology and approaches to temporal grounding in postreactive robotics, 1999, Optics East.

[12] Henry Lieberman, et al., Let's browse: a collaborative browsing agent, 1999, Knowl. Based Syst.

[13] Hong Hong, et al., Human Cognition and Social Agent Technology, 2001, J. Educ. Technol. Soc.

[14] Phoebe Sengers, et al., Antiboxology: agent design in cultural context, 1998.

[15] M. Mateas, et al., What is Narrative Intelligence?, 1998.

[16] Roger C. Schank, et al., Scripts, plans, goals and understanding: an inquiry into human knowledge structures, 1978.

[17] R. Schank, Tell Me a Story: A New Look at Real and Artificial Memory, 1991.

[18] Aaron Sloman, et al., Why Robots Will Have Emotions, 1981, IJCAI.

[19] Kerstin Dautenhahn, et al., I Could Be You: The Phenomenological Dimension of Social Understanding, 1997, Cybern. Syst.

[20] M. Heidegger, On Time and Being, 2008.

[21] Clark Elliott, et al., Story-morphing in the affective reasoning paradigm: generating stories semi-automatically for use with "emotionally intelligent" multimedia agents, 1998, AGENTS '98.

[22] B. Gorayska, et al., Putting the horse before the cart: formulating and exploring methods for studying cognitive technology, 1997, Proceedings Second International Conference on Cognitive Technology: Humanizing the Information Age.

[23] James C. Lester, et al., Increasing believability in animated pedagogical agents, 1997, AGENTS '97.

[24] E. Vesterinen, et al., Affective Computing, 2009, Encyclopedia of Biometrics.

[25] Henry Lieberman, et al., Let's browse: a collaborative Web browsing agent, 1998, IUI '99.

[26] K. Dautenhahn, et al., Living with Socially Intelligent Agents: A Cognitive Technology View, 2000.

[27] Andrew Ortony, et al., The Cognitive Structure of Emotions, 1988.

[28] Justine Cassell, et al., Storytelling systems: constructing the innerface of the interface, 1997, Proceedings Second International Conference on Cognitive Technology: Humanizing the Information Age.

[29] P. Maes, et al., Old tricks, new dogs: ethology and interactive creatures, 1997.

[30] Joseph Weizenbaum, ELIZA—a computer program for the study of natural language communication between man and machine, 1966, CACM.

[31] J. Cassell, et al., Rosebud: a place for interaction between memory, story, and self, 1997, Proceedings Second International Conference on Cognitive Technology: Humanizing the Information Age.

[32] K. Dautenhahn, et al., Semigroup Expansions for Autobiographic Agents, 1997.

[33] A. Sloman, et al., Towards a Design-Based Analysis of Emotional Episodes, 1996.

[34] Chrystopher L. Nehaniv, et al., What's Your Story? Irreversibility, Algebra, Autobiographic Agents, 1997.

[35] C. Elliott, The affective reasoner: a process model of emotions in a multi-agent system, 1992.

[36] Rolf Pfeifer, The "Fungus Eater Approach" to Emotion: A View from Artificial Intelligence, 1994.