Hybrid Avatar-Agent Technology – A Conceptual Step Towards Mediated “Social” Virtual Reality and its Respective Challenges

Abstract Driven by large industry investments, developments in Virtual Reality (VR) technologies, including unobtrusive sensors, actuators, and novel display devices, are progressing rapidly. Realism and interactivity have been postulated as crucial aspects of immersive VR since the concept’s inception. However, today’s VR still falls short of creating real life-like experiences in many regards. This holds particularly true when the “social dimension” is introduced into virtual worlds. Creating convincing virtual selves and virtual others and conveying meaningful, appropriate social behavior remains an open challenge for future VR. This challenge involves both technical aspects, such as the real-time capacities of the systems, and psychological aspects, such as the dynamics of human communication. Our knowledge of VR systems is still fragmented with regard to social cognition, although the social dimension is crucial when aiming at autonomous agents with a certain social background intelligence. It is questionable, though, whether a perfect copy of real-life interactions is a realistic or even meaningful goal of social VR development at this stage. Taking into account the specific strengths and weaknesses of humans and machines, we propose a conceptual turn in social VR that focuses on what we call “hybrid avatar-agent systems”. Such systems are required to generate (i) avatar-mediated interactions between real humans, taking advantage of their social intuitions and flexible communicative skills, and (ii) an artificial social intelligence (ASI) which monitors, and potentially moderates or transforms, ongoing virtual interactions based on social signals, for example by adaptively manipulating behavior in intercultural conversations. The current article sketches a corresponding base architecture and discusses necessary research prospects and challenges as a starting point for future research and development.
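The mediation pipeline sketched in the abstract can be illustrated in code. The following is a minimal, hypothetical sketch (all class and parameter names are our own illustration, not an API from the article): real users drive their avatars, while an artificial social intelligence (ASI) sits between sender and receiver, monitors social signals, and may attenuate a nonverbal cue that would exceed the norms of the receiver's cultural context.

```python
from dataclasses import dataclass

@dataclass
class SocialSignal:
    """One sampled nonverbal cue from a tracked user (hypothetical representation)."""
    kind: str         # e.g. "gaze", "gesture"
    intensity: float  # normalized to the range 0..1

class ArtificialSocialIntelligence:
    """Monitors an ongoing interaction and transforms cues toward a target context.

    The norm values below are purely illustrative placeholders, not empirical data.
    """

    # Maximum comfortable intensity per signal kind, per (assumed) context.
    NORMS = {
        "formal":   {"gaze": 0.6, "gesture": 0.4},
        "informal": {"gaze": 0.9, "gesture": 0.8},
    }

    def __init__(self, target_context: str):
        self.norms = self.NORMS[target_context]

    def transform(self, signal: SocialSignal) -> SocialSignal:
        """Attenuate a cue that would exceed the receiver's norm; pass others through."""
        cap = self.norms.get(signal.kind, 1.0)
        return SocialSignal(signal.kind, min(signal.intensity, cap))

def mediate(signals, asi):
    """Route the sender's behavior through the ASI before the receiver's avatar renders it."""
    return [asi.transform(s) for s in signals]

# Example: a sender's expansive gesturing is toned down for a formal context,
# while an already-moderate gaze cue is left unchanged.
sent = [SocialSignal("gesture", 0.9), SocialSignal("gaze", 0.5)]
received = mediate(sent, ArtificialSocialIntelligence("formal"))
```

In a real system the transformation step would of course be driven by learned models of social signals rather than fixed caps; the sketch only shows where such a moderation layer sits in the avatar-mediated loop.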
