Architectural requirements for human-like agents both natural and artificial (What sorts of machines can love?)

This paper, an expanded version of a talk on love given to a literary society, attempts to analyse some of the architectural requirements for an agent that is capable of having primary, secondary and tertiary emotions, including being infatuated or in love. It elaborates on earlier work in the Birmingham Cognition and Affect group, describing our proposed three-level architecture (with reactive, deliberative and meta-management layers) and showing how different sorts of emotions relate to those layers. Some of the relationships between emotional states involving partial loss of control of attention (e.g. the emotional states involved in being in love) and other states which involve dispositions (e.g. attitudes such as loving) are discussed and related to the architecture. The work of poets and playwrights can be shown to involve an implicit commitment to the hypothesis that minds are (at least) information-processing engines. Besides loving, many other familiar states and processes such as seeing, deciding, wondering whether, hoping, regretting, enjoying, disliking, learning, planning and acting all involve various sorts of information processing. By analysing the requirements for such processes to occur, relating them to our evolutionary history and to what is known about animal brains, and comparing this with what is being learnt from work on artificial minds in artificial intelligence, we can begin to formulate new and deeper theories about how minds work, including how we come to think about qualia, many forms of learning and development, and the results of brain damage or abnormality. But there is much prejudice that gets in the way of such theorising, and also much misunderstanding because people construe notions of "information processing" too narrowly.
