Tag your emotions: a novel mobile user interface for annotating images with emotions

People collect ever more data, and this is especially true for images on mobile devices. Tagging images is a good way to organize such collections. While automatic tagging systems typically focus on the content of an image, such as the objects or persons it depicts, manual annotations are important for describing its context. Emotions in particular often matter, e.g., when a person reflects on a situation, shows images from a very personal collection to others, or uses images to illustrate presentations. Unfortunately, manual annotation is often tedious, and users are rarely motivated to do it. While there are many approaches to motivating people to annotate data in a conventional way, none of them has focused on emotions. In this poster abstract, we present EmoWheel, an innovative interface for annotating images with emotional tags. We conducted a user study with 18 participants. The results show that the EmoWheel can enhance the motivation to annotate images.
