Using Body Language of Avatars in VR Meetings as Communication Status Cue

Although traditional videoconferencing raises privacy concerns, virtual reality meetings are not yet widely adopted: their communication quality still suffers from limited usability, and important non-verbal cues, such as body language, are underrepresented. We explore how virtual avatars' body language can be used to indicate meeting attendees' communication status. By comparing users' perceptions of avatar behavior, we found that avatar body language, across genders, can signal willingness to communicate. From these findings we derive body language design recommendations: use attentively behaving avatars as the default, and indicate that a user is busy through avatar actions such as drinking, typing, or talking on a phone. These actions convey that the user is temporarily occupied with another task but is still attending the meeting. When a user is unavailable, their avatar should not be displayed at all, and during longer meeting interruptions the user's avatar should leave the virtual meeting room.
