A Review of Eye Gaze in Virtual Agents, Social Robotics and HCI: Behaviour Generation, User Interaction and Perception
Kerstin Ruhland | Christopher E. Peters | Sean Andrist | Jeremy B. Badler | Norman I. Badler | Michael Gleicher | Bilge Mutlu | Rachel McDonnell
[1] Dan Witzner Hansen,et al. Eye-based head gestures , 2012, ETRA.
[2] Howell O. Istance,et al. Designing gaze gestures for gaming: an investigation of performance , 2010, ETRA.
[3] Brian Scassellati,et al. Are you looking at me? Perception of robot attention is mediated by gaze type and group size , 2013, 2013 8th ACM/IEEE International Conference on Human-Robot Interaction (HRI).
[4] Sean Andrist,et al. Designing effective gaze mechanisms for virtual agents , 2012, CHI.
[5] J. Loomis,et al. Interpersonal Distance in Immersive Virtual Environments , 2003, Personality & social psychology bulletin.
[6] L. Stark,et al. The main sequence, a tool for studying human eye movements , 1975 .
[7] Yuyu Xu,et al. Virtual character performance from speech , 2013, SCA '13.
[8] Christopher E. Peters,et al. A head movement propensity model for animating gaze shifts and blinks of virtual characters , 2010, Comput. Graph..
[9] H. Hendriks-Jansen. Catching Ourselves in the Act: Situated Activity, Interactive Emergence, Evolution, and Human Thought , 1996 .
[10] Brent Lance,et al. Real-time expressive gaze animation for virtual humans , 2009, AAMAS.
[11] Brent Lance,et al. Emotionally Expressive Head and Body Movement During Gaze Shifts , 2007, IVA.
[12] Neil A. Dodgson,et al. Rendering synthetic ground truth images for eye tracker evaluation , 2014, ETRA.
[13] Brent Lance,et al. Glances, glares, and glowering: how should a virtual human express emotion through gaze? , 2009, Autonomous Agents and Multi-Agent Systems.
[14] Dirk Heylen,et al. Head Gestures, Gaze and the Principles of Conversational Structure , 2006, Int. J. Humanoid Robotics.
[15] J. Fuller,et al. Head movement propensity , 2004, Experimental Brain Research.
[16] A. Mehrabian. Basic Dimensions For A General Psychological Theory , 1980 .
[17] Jeremy N. Bailenson,et al. Equilibrium Theory Revisited: Mutual Gaze and Personal Space in Virtual Environments , 2001, Presence: Teleoperators & Virtual Environments.
[18] Sean Andrist,et al. A head-eye coordination model for animating gaze shifts of virtual characters , 2012, Gaze-In '12.
[19] A. Kendon. Some functions of gaze-direction in social interaction. , 1967, Acta psychologica.
[20] Qianli Xu,et al. Designing engagement-aware agents for multiparty conversations , 2013, CHI.
[21] Mark Steedman,et al. APML, a Markup Language for Believable Behavior Generation , 2004, Life-like characters.
[22] C. Izard. The psychology of emotions , 1991 .
[23] Zhigang Deng,et al. Live Speech Driven Head-and-Eye Motion Generators , 2012, IEEE Transactions on Visualization and Computer Graphics.
[24] Xia Mao,et al. Emotional eye movement generation based on Geneva Emotion Wheel for virtual agents , 2012, J. Vis. Lang. Comput..
[25] S. Gosling,et al. A very brief measure of the Big-Five personality domains , 2003 .
[26] Peter J. Hunter,et al. A virtual environment and model of the eye for surgical simulation , 1994, SIGGRAPH.
[27] A. Fuchs,et al. Lid-eye coordination during vertical gaze changes in man and monkey. , 1988, Journal of neurophysiology.
[28] Fulvio Corno,et al. DOGeye: Controlling your home with eye interaction , 2011, Interact. Comput..
[29] Carlos Busso,et al. Generating Human-Like Behaviors Using Joint, Speech-Driven Models for Conversational Agents , 2012, IEEE Transactions on Audio, Speech, and Language Processing.
[30] Gérard Bailly,et al. Towards eye gaze aware analysis and synthesis of audiovisual speech , 2007, AVSP.
[31] Nadia Magnenat-Thalmann,et al. Realistic Emotional Gaze and Head Behavior Generation Based on Arousal and Dominance Factors , 2010, MIG.
[32] J. Burgoon,et al. Nonverbal Communication: The Unspoken Dialogue , 1988 .
[33] Christopher E. Peters,et al. Attention-driven eye gaze and blinking for virtual humans , 2003, SIGGRAPH '03.
[34] Kristinn R. Thórisson,et al. The Power of a Nod and a Glance: Envelope Vs. Emotional Feedback in Animated Conversational Agents , 1999, Appl. Artif. Intell..
[35] John Funge,et al. Cognitive modeling: knowledge, reasoning and planning for intelligent characters , 1999, SIGGRAPH.
[36] Igor S. Pandzic,et al. On creating multimodal virtual humans—real time speech driven facial gesturing , 2010, Multimedia Tools and Applications.
[37] Elisabetta Bevacqua,et al. A Survey of Listener Behavior and Listener Models for Embodied Conversational Agents , 2013 .
[38] Carol O'Sullivan,et al. Eye-catching crowds: saliency based selective variation , 2009, ACM Trans. Graph..
[39] Lilly Irani,et al. Amazon Mechanical Turk , 2018, Advances in Intelligent Systems and Computing.
[40] Norman I. Badler,et al. Eye Movements, Saccades, and Multiparty Conversations , 2008 .
[41] Justine Cassell,et al. Fully Embodied Conversational Avatars: Making Communicative Behaviors Autonomous , 1999, Autonomous Agents and Multi-Agent Systems.
[42] A. Abele,et al. Functions of gaze in social interaction: Communication and monitoring , 1986 .
[43] W. T. Norman,et al. Toward an adequate taxonomy of personality attributes: replicated factors structure in peer nomination personality ratings. , 1963, Journal of abnormal and social psychology.
[44] Dinesh K. Pai,et al. Eyecatch: simulating visuomotor coordination for object interception , 2012, ACM Trans. Graph..
[45] Demetri Terzopoulos,et al. Artificial fishes: physics, locomotion, perception, behavior , 1994, SIGGRAPH.
[46] Martin Breidt,et al. Face reality: investigating the Uncanny Valley for virtual faces , 2010, SIGGRAPH ASIA.
[47] Kai Vogeley,et al. The effects of self-involvement on attention, arousal, and facial expression during social interaction with virtual others: A psychophysiological study , 2006, Social neuroscience.
[48] S. Tipper,et al. Gaze cueing of attention: visual attention, social cognition, and individual differences. , 2007, Psychological bulletin.
[49] J. M. Kittross. The measurement of meaning , 1959 .
[50] Norman I. Badler,et al. Visual Attention and Eye Gaze During Multiparty Conversations with Distractions , 2006, IVA.
[51] Ludovic Hoyet,et al. Evaluating the effect of emotion on gender recognition in virtual humans , 2013, SAP.
[52] Anthony Steed,et al. Eyelid kinematics for virtual characters , 2010, Comput. Animat. Virtual Worlds.
[53] Norman I. Badler,et al. Evaluating perceived trust from procedurally animated gaze , 2013, MIG.
[54] Shrikanth Narayanan,et al. Learning Expressive Human-Like Head Motion Sequences from Speech , 2008 .
[55] Nicole Chovil. Discourse‐oriented facial displays in conversation , 1991 .
[56] C. Osgood,et al. The Measurement of Meaning , 1958 .
[57] Herwin van Welbergen,et al. Designing Appropriate Feedback for Virtual Agents and Robots , 2012, HRI 2012.
[58] Eric Horvitz,et al. Facilitating multiparty dialog with gaze, gesture, and speech , 2010, ICMI-MLMI '10.
[59] Norman I. Badler,et al. Eyes alive , 2002, ACM Trans. Graph..
[60] Peter R. Greene,et al. Gaussian and Poisson Blink Statistics: A Preliminary Study , 1986, IEEE Transactions on Biomedical Engineering.
[61] D. Robinson,et al. The vestibulo‐ocular reflex during human saccadic eye movements. , 1986, The Journal of physiology.
[62] A. Kendon. Conducting Interaction: Patterns of Behavior in Focused Encounters , 1990 .
[63] Derek Bradley,et al. High-quality capture of eyes , 2014, ACM Trans. Graph..
[64] J. Stahl,et al. Amplitude of human head movements associated with horizontal saccades , 1999, Experimental Brain Research.
[65] Cynthia Breazeal,et al. Effect of a robot on user perceptions , 2004, 2004 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) (IEEE Cat. No.04CH37566).
[66] T. C. Nicholas Graham,et al. Use of eye movements for video game control , 2006, ACE '06.
[67] Elisabeth André,et al. Breaking the Ice in Human-Agent Communication: Eye-Gaze Based Initiation of Contact with an Embodied Conversational Agent , 2009, IVA.
[68] M. Aramideh,et al. Eyelid movements: behavioral studies of blinking in humans under different stimulus conditions. , 2003, Journal of neurophysiology.
[69] J. Cassell,et al. Turn taking vs. Discourse Structure: How Best to Model Multimodal Conversation , 1998 .
[70] Christopher E. Peters. Direction of Attention Perception for Conversation Initiation in Virtual Environments , 2005, IVA.
[71] Gamini Dissanayake,et al. Nonverbal robot-group interaction using an imitated gaze cue , 2011, 2011 6th ACM/IEEE International Conference on Human-Robot Interaction (HRI).
[72] Kadi Bouatouch,et al. Image-Based Modeling of the Human Eye , 2009, IEEE Transactions on Visualization and Computer Graphics.
[73] Yuichiro Yoshikawa,et al. Responsive Robot Gaze to Interaction Partner , 2006, Robotics: Science and Systems.
[74] Dirk Heylen,et al. First Impressions: Users' Judgments of Virtual Agents' Personality and Interpersonal Attitude in First Encounters , 2012, IVA.
[75] Nicole C. Krämer,et al. It's in Their Eyes: A Study on Female and Male Virtual Humans' Gaze , 2011, IVA.
[76] Erik Reinhard,et al. An Ocularist's Approach to Human Iris Synthesis , 2003, IEEE Computer Graphics and Applications.
[77] Oleg V. Komogortsev,et al. 2D Linear oculomotor plant mathematical model , 2013, ACM Trans. Appl. Percept..
[78] Soraia Raupp Musse,et al. Providing expressive gaze to virtual animated characters in interactive applications , 2008, CIE.
[79] Hung-Hsuan Huang,et al. Implementation and evaluation of a multimodal addressee identification mechanism for multiparty conversation systems , 2013, ICMI '13.
[80] Gérard Bailly,et al. Scrutinizing Natural Scenes: Controlling the Gaze of an Embodied Conversational Agent , 2007, IVA.
[81] Darren Gergle,et al. See what i'm saying?: using Dyadic Mobile Eye tracking to study collaborative reference , 2011, CSCW.
[82] Yukiko I. Nakano,et al. Effectiveness of Gaze-Based Engagement Estimation in Conversational Agents , 2013, Eye Gaze in Intelligent User Interfaces.
[83] Shigeru Kitazawa,et al. Eyeblink entrainment at breakpoints of speech , 2010, Neuroscience Research.
[84] Arvid Kappas,et al. Mixing Implicit and Explicit Probes: Finding a Ground Truth for Engagement in Social Human-Robot Interactions , 2014, 2014 9th ACM/IEEE International Conference on Human-Robot Interaction (HRI).
[85] Anthony Steed,et al. High-Fidelity Avatar Eye-Representation , 2008, 2008 IEEE Virtual Reality Conference.
[86] Mel Slater,et al. The impact of avatar realism and eye gaze control on perceived quality of communication in a shared immersive virtual environment , 2003, CHI '03.
[87] Stefan Kopp,et al. The Behavior Markup Language: Recent Developments and Challenges , 2007, IVA.
[88] Matthias Scheutz,et al. Adaptive eye gaze patterns in interactions with human and artificial agents , 2012, TIIS.
[89] S. Brennan,et al. Speakers' eye gaze disambiguates referring expressions early during face-to-face conversation , 2007 .
[90] W. Murch. In the blink of an eye : a perspective on film editing , 2001 .
[91] R. Wurtz,et al. The Neurobiology of Saccadic Eye Movements , 1989 .
[92] Makoto Sato,et al. Reactive Virtual Human with Bottom-up and Top-down Visual Attention for Gaze Generation in Realtime Interactions , 2007, 2007 IEEE Virtual Reality Conference.
[93] J. Cassell,et al. Intersubjectivity in human-agent interaction , 2007 .
[94] Matthew W. Crocker,et al. The effect of robot gaze on processing robot utterances , 2009 .
[95] Norihiro Hagita,et al. Messages embedded in gaze of interface agents --- impression management with agent's gaze , 2002, CHI.
[96] Karon E. MacLean,et al. Meet Me where I’m Gazing: How Shared Attention Gaze Affects Human-Robot Handover Timing , 2014, 2014 9th ACM/IEEE International Conference on Human-Robot Interaction (HRI).
[97] Anthony Steed,et al. Short Paper: Exploring the Object Relevance of a Gaze Animation Model , 2011, EGVE/EuroVR.
[98] Mohammad Obaid,et al. Cultural Behaviors of Virtual Agents in an Augmented Reality Environment , 2012, IVA.
[99] Laurent Itti,et al. Photorealistic Attention-Based Gaze Animation , 2006, 2006 IEEE International Conference on Multimedia and Expo.
[100] John Lasseter. Tricks to animating characters with a computer , 2001, COMG.
[101] J. Harbison,et al. The Neurology of Eye Movements, 3rd ed , 2000 .
[102] P. Ekman,et al. Unmasking the face : a guide to recognizing emotions from facial clues , 1975 .
[103] Jason Osipa. Stop Staring: Facial Modeling and Animation Done Right , 2003 .
[104] Michael D. Buhrmester,et al. Amazon's Mechanical Turk , 2011, Perspectives on psychological science : a journal of the Association for Psychological Science.
[105] Christophe Garcia,et al. Modeling Gaze Behavior for a 3D ECA in a Dialogue Situation , 2005, Gesture Workshop.
[106] Sean Andrist,et al. Conversational Gaze Aversion for Humanlike Robots , 2014, 2014 9th ACM/IEEE International Conference on Human-Robot Interaction (HRI).
[107] Brent Lance,et al. The Rickel Gaze Model: A Window on the Mind of a Virtual Human , 2007, IVA.
[108] Takayuki Kanda,et al. Footing in human-robot conversations: How robots might shape participant roles using gaze cues , 2009, 2009 4th ACM/IEEE International Conference on Human-Robot Interaction (HRI).
[109] J. P. Otteson,et al. Effect of Teacher's Gaze on Children's Story Recall , 1980 .
[110] Oleg Spakov,et al. Enhanced gaze interaction using simple head gestures , 2012, UbiComp.
[111] Andreas Nürnberger,et al. Designing gaze-supported multimodal interactions for the exploration of large image collections , 2011, NGCA '11.
[112] Bilge Mutlu,et al. Modeling and Evaluating Narrative Gestures for Humanlike Robots , 2013, Robotics: Science and Systems.
[113] Michael Kipp,et al. IGaze: Studying Reactive Gaze Behavior in Semi-immersive Human-Avatar Interactions , 2008, IVA.
[114] Francis Glebas. The Animator's Eye : Composition and Design for Better Animation , 2013 .
[115] Iain Matthews,et al. Modeling and animating eye blinks , 2011, TAP.
[116] M. Bradley,et al. The pupil as a measure of emotional arousal and autonomic activation. , 2008, Psychophysiology.
[117] Henna Heikkilä,et al. Tools for a Gaze-Controlled Drawing Application - Comparing Gaze Gestures against Dwell Buttons , 2013, INTERACT.
[118] L. Stark,et al. Most naturally occurring human saccades have magnitudes of 15 degrees or less. , 1975, Investigative ophthalmology.
[119] Bilge Mutlu,et al. Human-robot proxemics: Physical and psychological distancing in human-robot interaction , 2011, 2011 6th ACM/IEEE International Conference on Human-Robot Interaction (HRI).
[120] Michele A. Basso,et al. Not looking while leaping: the linkage of blinking and saccadic gaze shifts , 2004, Experimental Brain Research.
[121] John P. Lewis,et al. Automated eye motion using texture synthesis , 2005, IEEE Computer Graphics and Applications.
[122] Catharine Oertel,et al. Gaze direction as a Back-Channel inviting Cue in Dialogue , 2012 .
[123] Daniel Thalmann,et al. Simulating gaze attention behaviors for crowds , 2009, Comput. Animat. Virtual Worlds.
[124] Jessica K. Hodgins,et al. The saliency of anomalies in animated human characters , 2010, TAP.
[125] D. Guitton,et al. Gaze control in humans: eye-head coordination during orienting movements to targets within and beyond the oculomotor range. , 1987, Journal of neurophysiology.
[126] Panagiotis G. Ipeirotis,et al. Running Experiments on Amazon Mechanical Turk , 2010, Judgment and Decision Making.
[127] Ipke Wachsmuth,et al. An operational model of joint attention - Timing of the initiate-act in interactions with a virtual human , 2012 .
[128] G. Sjøgaard,et al. Eye blink frequency during different computer tasks quantified by electrooculography , 2006, European Journal of Applied Physiology.
[129] Jacob O. Wobbrock,et al. Not Typing but Writing: Eye-based Text Entry Using Letter-like Gestures , 2007 .
[130] Ning Wang,et al. Don't just stare at me! , 2010, CHI.
[131] H. H. Clark,et al. Speaking while monitoring addressees for understanding , 2004 .
[132] Radoslaw Niewiadomski,et al. Computational Models of Expressive Behaviors for a Virtual Agent , 2012 .
[133] V. Yngve. On getting a word in edgewise , 1970 .
[134] John Paulin Hansen,et al. Single gaze gestures , 2010, ETRA '10.
[135] N. Badler,et al. Eyes alive , 2002, ACM Trans. Graph..
[136] I. Scott MacKenzie,et al. The use of gaze to control drones , 2014, ETRA.
[137] Robin L. Hill,et al. Referring and gaze alignment: accessibility is alive and well in situated dialogue , 2009 .
[138] Elisabetta Bevacqua,et al. A Model of Attention and Interest Using Gaze Behavior , 2005, IVA.
[139] Bilge Mutlu,et al. Stylized and Performative Gaze for Character Animation , 2013, Comput. Graph. Forum.
[140] Jens Edlund,et al. Taming Mona Lisa: Communicating gaze faithfully in 2D and 3D facial projections , 2012, TIIS.
[141] S. Duncan,et al. On the structure of speaker–auditor interaction during speaking turns , 1974, Language in Society.
[142] Sven Behnke,et al. Integrating vision and speech for conversations with multiple persons , 2005, 2005 IEEE/RSJ International Conference on Intelligent Robots and Systems.
[143] Antonio Camurri,et al. Fundamentals of Agent Perception and Attention Modelling , 2011 .
[144] Stefan Kopp,et al. Using Virtual Agents to Guide Attention in Multi-task Scenarios , 2013, IVA.
[145] Yukiko I. Nakano,et al. Estimating user's engagement from eye-gaze behaviors in human-agent conversations , 2010, IUI '10.
[146] J. Stern,et al. The endogenous eyeblink. , 1984, Psychophysiology.
[147] Gladimir V. G. Baranoski,et al. A Predictive Light Transport Model for the Human Iris , 2006, Comput. Graph. Forum.
[148] Kari-Jouko Räihä,et al. Simple gaze gestures and the closure of the eyes as an interaction technique , 2012, ETRA.
[149] Brigitte Krenn,et al. Embodied Conversational Characters: Representation Formats for Multimodal Communicative Behaviours , 2011 .
[150] Steve Garner,et al. Evaluating an eye tracking interface for a two-dimensional sketch editor , 2013, Comput. Aided Des..
[151] Dirk Heylen,et al. Generating Nonverbal Signals for a Sensitive Artificial Listener , 2007, COST 2102 Workshop.
[152] D. Gática-Pérez. Modelling Interest in Face-to-Face Conversations from Multimodal Nonverbal Behaviour , 2010 .
[153] S G Lisberger,et al. Visual motion processing for the initiation of smooth-pursuit eye movements in humans. , 1986, Journal of neurophysiology.
[154] Dirk Heylen,et al. Highly Realistic 3D Presentation Agents with Visual Attention Capability , 2007, Smart Graphics.
[155] Carol O'Sullivan,et al. Clone attack! Perception of crowd variety , 2008, ACM Trans. Graph..
[156] Timothy W. Bickmore,et al. Changes in verbal and nonverbal conversational behavior in long-term interaction , 2012, ICMI '12.
[157] Justine Cassell,et al. BodyChat: autonomous communicative behaviors in avatars , 1998, AGENTS '98.
[158] Anthony Steed,et al. An assessment of eye-gaze potential within immersive virtual environments , 2007, TOMCCAP.
[159] Jaap Ham,et al. I Didn't Know That Virtual Agent Was Angry at Me: Investigating Effects of Gaze Direction on Emotion Recognition and Evaluation , 2013, PERSUASIVE.
[160] Brent Lance,et al. The Expressive Gaze Model: Using Gaze to Express Emotion , 2010, IEEE Computer Graphics and Applications.
[161] M. Doughty. Consideration of Three Types of Spontaneous Eyeblink Activity in Normal Humans: during Reading and Video Display Terminal Use, in Primary Gaze, and while in Conversation , 2001, Optometry and vision science : official publication of the American Academy of Optometry.
[162] Stacy Marsella,et al. Towards Expressive Gaze Manner in Embodied Virtual Agents , 2004 .
[163] Neil A. Dodgson,et al. Eye movements and attention for behavioural animation , 2002, Comput. Animat. Virtual Worlds.
[164] Junichi Hoshino,et al. Head‐eye Animation Corresponding to a Conversation for CG Characters , 2007, Comput. Graph. Forum.
[165] Mel Slater,et al. An Eye Gaze Model for Dyadic Interaction in an Immersive Virtual Environment: Practice and Experience , 2004, Comput. Graph. Forum.
[166] K. Kroschel,et al. Evaluation of natural emotions using self assessment manikins , 2005, IEEE Workshop on Automatic Speech Recognition and Understanding, 2005..
[167] D. Olson,et al. Developing theories of mind , 1988 .
[168] Dominik Schmidt,et al. Eye Pull, Eye Push: Moving Objects between Large Screens and Personal Devices with Gaze and Touch , 2013, INTERACT.
[169] Anthony Steed,et al. A saliency-based method of simulating visual attention in virtual scenes , 2009, VRST '09.
[170] C. Koch,et al. Models of bottom-up and top-down visual attention , 2000 .
[171] Robin J. S. Sloan,et al. Using virtual agents to cue observer attention , 2010 .
[172] Andrew Tucker,et al. Monitoring eye and eyelid movements by infrared reflectance oculography to measure drowsiness in drivers , 2007 .
[173] Albrecht Schmidt,et al. Interacting with the Computer Using Gaze Gestures , 2007, INTERACT.
[174] Michael Andres,et al. Dissociable roles of the human somatosensory and superior temporal cortices for processing social face signals , 2004, The European journal of neuroscience.
[175] Steve Roberts,et al. Character Animation: 2D Skills for Better 3D , 2012 .
[176] F. Thomas,et al. The illusion of life : Disney animation , 1981 .
[177] Anthony Steed,et al. Modelling selective visual attention for autonomous virtual characters , 2011, Comput. Animat. Virtual Worlds.
[178] Veronica Sundstedt,et al. Gaze and voice based game interaction: the revenge of the killer penguins , 2008, SIGGRAPH '08.
[179] Norman I. Badler,et al. Where to Look? Automating Attending Behaviors of Virtual Human Characters , 1999, AGENTS '99.
[180] M. Mori. The Uncanny Valley , 2019, The Animation Studies Reader.
[181] John Paulin Hansen,et al. Gaze input for mobile devices by dwell and gestures , 2012, ETRA.
[182] Dirk Heylen,et al. Gaze behaviour, believability, likability and the iCat , 2009, AI & SOCIETY.
[183] Armando Barreto,et al. Performance analysis of an integrated eye gaze tracking / electromyogram cursor control system , 2007, Assets '07.
[184] Soraia Raupp Musse,et al. Automatic Generation of Expressive Gaze in Virtual Animated Characters: From Artists Craft to a Behavioral Animation Model , 2007, IVA.
[185] Marina L. Gavrilova,et al. Iris synthesis: a reverse subdivision application , 2005, GRAPHITE.
[186] Stefan Kopp,et al. Towards a Common Framework for Multimodal Generation: The Behavior Markup Language , 2006, IVA.
[187] Robin R. Murphy,et al. A survey of social gaze , 2011, 2011 6th ACM/IEEE International Conference on Human-Robot Interaction (HRI).
[188] Zhigang Deng,et al. Rigid Head Motion in Expressive Speech Animation: Analysis and Synthesis , 2007, IEEE Transactions on Audio, Speech, and Language Processing.
[189] Catherine Pelachaud,et al. Eye Communication in a Conversational 3D Synthetic Agent , 2000, AI Commun..
[190] D Guitton,et al. Upper eyelid movements measured with a search coil during blinks and vertical saccades. , 1991, Investigative ophthalmology & visual science.
[191] David G. Novick,et al. A Computational Model of Culture-Specific Conversational Behavior , 2007, IVA.
[192] M. Argyle,et al. Gaze and Mutual Gaze , 1994, British Journal of Psychiatry.
[193] Gin McCollum,et al. Variables Contributing to the Coordination of Rapid Eye/Head Gaze Shifts , 2006, Biological Cybernetics.
[194] Richard Bandler,et al. Frogs into Princes: Neuro-Linguistic Programming , 1979 .
[195] Matej Rojc,et al. A Survey of Listener Behavior and Listener Models for Embodied Conversational Agents , 2013 .
[196] Susan R. Fussell,et al. Comparing a computer agent with a humanoid robot , 2007, 2007 2nd ACM/IEEE International Conference on Human-Robot Interaction (HRI).
[197] Yukiko I. Nakano,et al. Avatar's Gaze Control to Facilitate Conversational Turn-Taking in Virtual-Space Multi-user Voice Chat System , 2006, IVA.
[198] Trevor Darrell,et al. Conditional Sequence Model for Context-Based Recognition of Gaze Aversion , 2007, MLMI.
[199] Henna Heikkilä. EyeSketch: a drawing application for gaze control , 2013, ETSA '13.
[200] S. Baron-Cohen,et al. Is There a "Language of the Eyes"? Evidence from Normal Adults, and Adults with Autism or Asperger Syndrome , 1997 .
[201] Brian Scassellati,et al. Robot gaze does not reflexively cue human attention , 2011, CogSci.
[202] Manuel Menezes de Oliveira Neto,et al. Photorealistic models for pupil light reflex and iridal pattern deformation , 2009, ACM Trans. Graph..
[203] S. Brennan. Eye gaze cues for coordination in collaborative tasks , 2011 .
[204] Christopher M. Harris,et al. The distribution of fixation durations in infants and naive adults , 1988, Vision Research.
[205] Daniel Thalmann,et al. Coordinating the Generation of Signs in Multiple Modalities in an Affective Agent , 2011 .
[206] Ari Shapiro,et al. Building a Character Animation System , 2011, MIG.
[207] C. Evinger,et al. Eyelid movements. Mechanisms and normal data. , 1991, Investigative ophthalmology & visual science.
[208] T. Allison,et al. Electrophysiological Studies of Face Perception in Humans , 1996, Journal of Cognitive Neuroscience.
[209] Marilyn A. Walker,et al. Bossy or Wimpy: Expressing Social Dominance by Combining Gaze and Linguistic Behaviors , 2010, IVA.
[210] Michael Banf,et al. Example‐Based Rendering of Eye Movements , 2009, Comput. Graph. Forum.
[211] S. Drucker,et al. The Role of Eye Gaze in Avatar Mediated Conversational Interfaces , 2000 .
[212] Mitsuru Ishizuka,et al. Attentive Presentation Agents , 2007, IVA.
[213] Ning Wang,et al. Creating Rapport with Virtual Agents , 2007, IVA.
[214] R. Likert. A Technique for the Measurement of Attitudes , 2022, The SAGE Encyclopedia of Research Design.
[215] Maja Pantic,et al. Article in press (title unavailable) , 2022, IEEE Transactions on Affective Computing.
[216] C. Anderson,et al. PVT lapses differ according to eyes open, closed, or looking away. , 2010, Sleep.
[217] Makoto Kato,et al. Blink-related momentary activation of the default mode network while viewing videos , 2012, Proceedings of the National Academy of Sciences.
[218] Andreas Paepcke,et al. EyePoint: practical pointing and selection using gaze and keyboard , 2007, CHI.
[219] Irene Albrecht,et al. Automatic Generation of Non-Verbal Facial Expressions from Speech , 2002 .
[220] John K. Tsotsos,et al. Modeling Visual Attention via Selective Tuning , 1995, Artif. Intell..
[221] Christopher E. Peters,et al. How Does Varying Gaze Direction Affect Interaction between a Virtual Agent and Participant in an On-Line Communication Scenario? , 2014, HCI.
[222] Denise Goodwin,et al. Clinical Anatomy and Physiology of the Visual System , 1997 .
[223] Zhigang Deng,et al. Realistic Eye Motion Synthesis by Texture Synthesis , 2008 .
[224] M. Cary. The Role of Gaze in the Initiation of Conversation , 1978 .
[225] Ed Hooks,et al. Acting for Animators , 2001 .
[226] Hannes Högni Vilhjálmsson. Animating Conversation in Online Games , 2004, ICEC.
[227] Stacy Marsella,et al. Expressive Behaviors for Virtual Worlds , 2004, Life-like characters.
[228] Kostas Karpouzis,et al. Investigating shared attention with a virtual agent using a gaze-based interface , 2010, Journal on Multimodal User Interfaces.
[229] Peter Oppenheimer,et al. Real time design and animation of fractal plants and trees , 1986, SIGGRAPH.
[230] Jörn Ostermann,et al. Video-realistic image-based eye animation via statistically driven state machines , 2010, The Visual Computer.
[231] N. Bee,et al. Relations between facial display, eye gaze and head tilt: Dominance perception variations of virtual agents , 2009, 2009 3rd International Conference on Affective Computing and Intelligent Interaction and Workshops.
[232] Poika Isokoski,et al. Gazing and frowning as a new human--computer interaction technique , 2004, TAP.
[233] B. Fischer,et al. Human express saccades: extremely short reaction times of goal directed eye movements , 2004, Experimental Brain Research.
[234] K. Shirai,et al. Controlling gaze of humanoid in communication with human , 1998, Proceedings. 1998 IEEE/RSJ International Conference on Intelligent Robots and Systems. Innovations in Theory, Practice and Applications (Cat. No.98CH36190).
[235] S. Baron-Cohen. How to build a baby that can read minds: Cognitive mechanisms in mindreading. , 1994 .
[236] Tony Belpaeme,et al. A study of a retro-projected robotic face and its effectiveness for gaze reading by humans , 2010, 2010 5th ACM/IEEE International Conference on Human-Robot Interaction (HRI).
[237] Raimund Dachselt,et al. Look & touch: gaze-supported target acquisition , 2012, CHI.
[238] Per Ola Kristensson,et al. The potential of dwell-free eye-typing for fast assistive gaze communication , 2012, ETRA.
[239] Christopher E. Peters. Animating Gaze Shifts for Virtual Characters Based on Head Movement Propensity , 2010, 2010 Second International Conference on Games and Virtual Worlds for Serious Applications.
[240] Ana Paiva,et al. Providing Gender to Embodied Conversational Agents , 2011, International Conference on Intelligent Virtual Agents.
[241] Alvin W. Yeo,et al. Gaze estimation model for eye drawing , 2006, CHI EA '06.
[242] Mark Steedman,et al. Animated conversation: rule-based generation of facial expression, gesture & spoken intonation for multiple conversational agents , 1994, SIGGRAPH.
[243] Sean Andrist,et al. Conversational Gaze Aversion for Virtual Agents , 2013, IVA.
[244] John H. Anderson,et al. Ocular torsion in the cat after lesions of the interstitial nucleus of Cajal , 1981, Annals of the New York Academy of Sciences.
[245] Rae A. Earnshaw,et al. Advances in Modelling, Animation and Rendering , 2002, Springer London.
[246] Marc Cavazza,et al. Gaze behavior during interaction with a virtual character in interactive storytelling , 2010 .
[247] Irwin Silverman,et al. Pupillometry: A sexual selection approach , 2004 .
[248] Justine Cassell,et al. BEAT: the Behavior Expression Animation Toolkit , 2001, Life-like characters.
[249] Elisabeth André,et al. Writing with Your Eye: A Dwell Time Free Writing System Adapted to the Nature of Human Eye Gaze , 2008, PIT.
[250] Du-Sik Park,et al. 3D user interface combining gaze and hand gestures for large-scale display , 2010, CHI EA '10.
[251] Ginevra Castellano,et al. An exploration of user engagement in HCI , 2009, AFFINE '09.
[252] Md. Golam Rashed,et al. Recognizing Gaze Pattern for Human Robot Interaction , 2014, 2014 9th ACM/IEEE International Conference on Human-Robot Interaction (HRI).
[253] Christopher E. Peters,et al. Bottom-up visual attention for virtual human animation , 2003, Proceedings 11th IEEE International Workshop on Program Comprehension.
[254] Heinrich H. Bülthoff,et al. Render me real? , 2012, ACM Trans. Graph..
[255] Gabriel Skantze,et al. Exploring the effects of gaze and pauses in situated human-robot interaction , 2013, SIGDIAL Conference.
[256] Dmitry Sokolov,et al. On Smooth 3D Frame Field Design , 2015, ArXiv.
[257] Ginevra Castellano,et al. Identifying Task Engagement: Towards Personalised Interactions with Educational Robots , 2013, 2013 Humaine Association Conference on Affective Computing and Intelligent Interaction.
[258] C. Kleinke. Gaze and eye contact: a research review. , 1986, Psychological bulletin.
[259] Hannes Högni Vilhjálmsson,et al. Animating Idle Gaze in Public Places , 2009, IVA.
[260] Christopher E. Peters. Evaluating Perception of Interaction Initiation in Virtual Environments Using Humanoid Agents , 2006, ECAI.
[261] Y. Nakano,et al. Gaze and Conversation Dominance in Multiparty Interaction , 2011 .
[262] Richard Williams. The Animator's Survival Kit--Revised Edition: A Manual of Methods, Principles and Formulas for Classical, Computer, Games, Stop Motion and Internet Animators , 2009 .
[263] Laurent Itti,et al. Realistic avatar eye and head animation using a neurobiological model of visual attention , 2004, SPIE Optics + Photonics.
[264] Brian Scassellati,et al. The Benefits of Interactions with Physically Present Robots over Video-Displayed Agents , 2011, Int. J. Soc. Robotics.
[265] Cristina Conati,et al. 2nd workshop on eye gaze in intelligent human machine interaction , 2011, IUI '11.
[266] Natalia A. Schmid,et al. A Model Based, Anatomy Based Method for Synthesizing Iris Images , 2006, ICB.
[267] Veronica Sundstedt,et al. Gaze and voice controlled drawing , 2011, NGCA '11.
[268] Antti Poikola,et al. Invisible eni: using gaze and pupil size to control a game , 2008, CHI Extended Abstracts.