Design and implementation of an affect-responsive interactive photo frame

This paper describes an affect-responsive interactive photo-frame application that offers its user a different experience with every use. It relies on visual analysis of its users' activity levels and facial expressions to select responses from a database of short video segments. This ever-growing database is prepared automatically by offline analysis of user-uploaded videos. The resulting system matches its user's affect along the dimensions of valence and arousal, and gradually adapts its responses to each specific user. In an extended mode, two such systems are coupled and feed each other with visual content. The strengths and weaknesses of the system are assessed through a usability study in which a Wizard-of-Oz response logic is contrasted with the fully automatic system, which uses affective and activity-based features either alone or in tandem.
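As a rough illustration of the response-selection step described above, the sketch below pairs the user's estimated affect with the nearest annotated clip in valence-arousal space. The Segment structure, the value ranges, and the nearest-neighbour matching rule are assumptions made for illustration only, not the paper's actual selection logic.

```python
import math
from dataclasses import dataclass
from typing import List

@dataclass
class Segment:
    """A short video clip annotated offline with its affective content.
    Field names and ranges are hypothetical, for illustration only."""
    path: str
    valence: float  # assumed range: -1 (negative) .. +1 (positive)
    arousal: float  # assumed range: -1 (calm) .. +1 (excited)

def select_response(segments: List[Segment],
                    valence: float, arousal: float) -> Segment:
    """Return the clip whose annotation lies closest to the user's
    estimated affect in the valence-arousal plane (Euclidean distance)."""
    return min(segments,
               key=lambda s: math.hypot(s.valence - valence,
                                        s.arousal - arousal))

# Example: a calm, mildly positive user gets the matching clip.
db = [Segment("smile.mp4", 0.8, 0.3), Segment("storm.mp4", -0.6, 0.7)]
clip = select_response(db, valence=0.5, arousal=0.1)  # -> smile.mp4
```

A simple nearest-neighbour rule like this is one plausible way to realize the affect matching the abstract describes; the actual system could equally weight the two dimensions differently or bias selection toward clips not yet shown to the current user.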
