Non-isomorphic Interaction Techniques for Controlling Avatar Facial Expressions in VR

Controlling an avatar's facial expressions in virtual reality is mainly based on the automated recognition and transposition of the user's own facial expressions. These isomorphic techniques are limited to what users can convey with their own face and suffer from recognition errors. To overcome these limitations, non-isomorphic techniques rely on input devices to control the avatar's facial expressions. Such techniques must let users select and control an expression quickly and easily, without disrupting a primary task such as talking. We present the design of a set of new non-isomorphic interaction techniques for controlling an avatar's facial expressions in VR with a standard VR controller. We evaluated these techniques in two controlled experiments to inform the design of an interaction technique combining the strengths of each approach. A final ecological study showed that the resulting technique can be used in contexts such as social applications.
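The abstract does not specify how the controller maps to expressions; the sketch below is purely illustrative, assuming a hypothetical radial-menu mapping in which thumbstick direction selects an expression and trigger pressure drives its intensity as blend-shape weights (the expression set, deadzone value, and function names are all assumptions, not the paper's actual design):

```python
import math

# Hypothetical expression set laid out on a radial menu (assumption:
# the paper's actual expressions and layout are not given in the abstract).
EXPRESSIONS = ["joy", "sadness", "anger", "surprise", "fear", "disgust"]

def select_expression(stick_x: float, stick_y: float, deadzone: float = 0.3):
    """Map a 2D thumbstick position to one expression on a radial layout.

    Returns None while the stick is inside the deadzone, so resting the
    thumb does not trigger an expression during a primary task like talking.
    """
    if math.hypot(stick_x, stick_y) < deadzone:
        return None
    angle = math.atan2(stick_y, stick_x) % (2 * math.pi)
    sector = 2 * math.pi / len(EXPRESSIONS)
    index = int((angle + sector / 2) // sector) % len(EXPRESSIONS)
    return EXPRESSIONS[index]

def expression_weights(selected, trigger: float):
    """Scale the selected expression by trigger pressure (0..1) to produce
    per-expression blend-shape weights for the avatar's face rig."""
    return {name: (trigger if name == selected else 0.0) for name in EXPRESSIONS}

# Example: thumbstick pushed up-right, trigger half pressed.
expr = select_expression(0.7, 0.7)
print(expr, expression_weights(expr, trigger=0.5))
```

In this sketch, the deadzone keeps the avatar's face neutral while the thumb merely rests on the stick, and the analog trigger provides continuous control over expression intensity rather than a binary on/off toggle.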
