Where to Look? Automating Certain Visual Attending Behaviors of Human Characters

This thesis proposes a computational framework for generating visual attending behavior in an embodied simulated human agent. Such behaviors directly control eye and head motions and guide other actions such as locomotion and reach. The implementation of these concepts, referred to as the AVA, draws on empirical and qualitative observations from psychology, human factors, and computer vision. Deliberate behaviors, the analogs of scanpaths in visual psychology, compete with involuntary attention capture and lapses into idling or free viewing. For efficiency, the embodied agent is assumed to have access to certain properties of the 3D world (scene graph) stored in the graphical environment. When information about a task is known, the scene graph is queried; when the agent lapses into free viewing or idling, no task constraints are active, so a simplified image analysis technique is employed to select potential directions of interest. Insights provided by implementing this framework include: a defined set of parameters that affect the observable effects of attention, a defined vocabulary of looking behaviors for certain motor and cognitive activities, a defined hierarchy of three levels of eye behavior (endogenous, exogenous, and idling) with a proposed method for how these types interact, a technique for modifying motor activity based on visual inputs, and a technique that allows anticipation and interleaving of eye behaviors for sequential motor actions. AVA-generated behavior is emergent and responds to environment context and dynamics, and the method animates behavior at interactive rates. Experiments covering several combinations of environment and attending conditions are demonstrated, followed by an evaluation of AVA effectiveness.
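To make the described arbitration concrete, the following is a minimal sketch (in Python, with hypothetical names and structures not taken from the thesis) of how the three levels of eye behavior might compete each frame: a strong exogenous event captures attention, otherwise the endogenous task scanpath proceeds, and with no active task the agent falls back to idling driven by simple saliency peaks.

```python
# Hypothetical sketch of three-level gaze arbitration:
# exogenous capture > endogenous (task) scanpath > idling / free viewing.
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class GazeTarget:
    direction: Tuple[float, float, float]  # desired gaze direction in world space
    priority: float                        # strength of the bid for the eyes/head

def select_gaze(task_scanpath: List[GazeTarget],
                exogenous_events: List[GazeTarget],
                saliency_peaks: List[GazeTarget],
                capture_threshold: float = 0.7) -> Optional[GazeTarget]:
    """Pick one gaze target for the current animation frame."""
    # Involuntary capture: a sufficiently strong onset or motion event
    # pre-empts whatever deliberate behavior is underway.
    strong = [e for e in exogenous_events if e.priority >= capture_threshold]
    if strong:
        return max(strong, key=lambda e: e.priority)
    # Endogenous behavior: next fixation of the task-driven scanpath,
    # e.g. obtained by querying the scene graph for task-relevant objects.
    if task_scanpath:
        return task_scanpath[0]
    # Idling / free viewing: no task constraints are active, so fall back
    # to peaks from a simplified image-analysis (saliency) pass.
    if saliency_peaks:
        return max(saliency_peaks, key=lambda p: p.priority)
    return None
```

The single-winner selection per frame is only one possible design; the thesis's actual interaction scheme among endogenous, exogenous, and idling behaviors may differ.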
