A Design Platform for Emotion-Aware User Interfaces

Machine recognition of human emotion has become a central goal in the development of next-generation user interaction for computer systems. While affect recognition technology has made substantial progress in recent years, applying user emotion recognition to software user interfaces is still in its early stages. In this paper, we describe the development of an emotion-aware user interface with a focus on visual appearance, and we propose an emotion-aware UI-authoring platform that helps designers create emotion-aware visual effects. To demonstrate feasibility, we implemented a prototype of the platform as the authoring tool DAT4UX, which can integrate the resulting designs into a mobile application equipped with an emotion recognition capability. A proof-of-concept application featuring an emotion-aware interface was developed using the tool.
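The abstract does not specify how authored designs are bound to recognized emotions at runtime. As a rough sketch of the kind of mapping such a platform implies, a designer-authored table could bind emotion labels to visual themes that the host application applies when its recognizer reports an emotion. All names below (`Theme`, `EMOTION_THEMES`, `theme_for`) are illustrative assumptions, not the paper's actual API.

```python
from dataclasses import dataclass

@dataclass
class Theme:
    """Visual parameters a designer might bind to a recognized emotion."""
    background: str       # background color (hex)
    accent: str           # accent color (hex)
    animation_speed: float  # relative tempo of UI transitions

# Hypothetical designer-authored mapping from emotion labels to themes.
EMOTION_THEMES = {
    "joy":     Theme(background="#FFF3C4", accent="#FF9F1C", animation_speed=1.3),
    "sadness": Theme(background="#DDE6F0", accent="#4A6FA5", animation_speed=0.7),
    "anger":   Theme(background="#FBE4E4", accent="#C0392B", animation_speed=1.1),
    "neutral": Theme(background="#FFFFFF", accent="#555555", animation_speed=1.0),
}

def theme_for(emotion: str) -> Theme:
    """Return the theme for a recognized emotion, falling back to neutral."""
    return EMOTION_THEMES.get(emotion, EMOTION_THEMES["neutral"])
```

In this sketch the recognizer and the authored design stay decoupled: the recognition component only emits a label, and the design tool only edits the table, which matches the paper's separation between the authoring platform and the emotion recognition capability.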
