Recognition of Advertisement Emotions With Application to Computational Advertising

Advertisements (ads) often contain strong affective content to capture viewer attention and convey an effective message to the audience. However, most computational affect recognition (AR) approaches examine ads via the text modality alone, and only limited work has been devoted to decoding ad emotions from audiovisual or user cues. This work (1) compiles an affective ad dataset capable of evoking coherent emotions across users; (2) explores the efficacy of content-centric convolutional neural network (CNN) features for AR vis-à-vis handcrafted audiovisual descriptors; (3) examines user-centric ad AR from electroencephalogram (EEG) responses acquired during ad viewing; and (4) demonstrates how better affect predictions facilitate effective computational advertising, as determined by a study involving 18 users. Experiments reveal that (a) CNN features outperform audiovisual descriptors for content-centric AR; (b) EEG features encode ad-induced emotions better than content-based features; (c) multi-task learning achieves the best AR among a range of classification algorithms; and (d) consistent with (b), EEG features also enable optimized ad insertion into streamed video, outperforming content-based and manual insertion techniques in terms of ad memorability and overall user experience.
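The abstract reports that multi-task learning gives the best affect recognition, presumably by exploiting the relatedness of the valence and arousal prediction tasks. As an illustration of the general idea (not the paper's actual model), the sketch below trains a shared linear feature transform with task-specific logistic heads on synthetic stand-in data; all variable names, dimensions, and hyperparameters are hypothetical.

```python
# Hypothetical multi-task learning sketch: valence and arousal classifiers
# share one linear feature transform W, each with its own logistic head.
# The "CNN features" and labels below are synthetic stand-ins.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Synthetic features for 200 ads, 16-D, with two correlated label tasks.
X = rng.normal(size=(200, 16))
base_w = rng.normal(size=16)
labels = {
    "valence": (X @ base_w > 0).astype(float),
    "arousal": (X @ (base_w + 0.3 * rng.normal(size=16)) > 0).astype(float),
}

W = 0.1 * rng.normal(size=(16, 8))                     # shared transform
heads = {t: 0.1 * rng.normal(size=8) for t in labels}  # task-specific heads

lr = 0.5
for _ in range(500):
    H = X @ W                          # shared representation
    grad_W = np.zeros_like(W)
    for task, w in heads.items():
        err = sigmoid(H @ w) - labels[task]            # cross-entropy gradient
        heads[task] = w - lr * (H.T @ err) / len(X)    # update this head
        grad_W += X.T @ (err[:, None] * w[None, :]) / len(X)
    W -= lr * grad_W                   # shared transform learns from both tasks

accs = {t: float((((X @ W) @ heads[t] > 0) == labels[t]).mean()) for t in labels}
print(accs)
```

Because both tasks back-propagate into the shared transform `W`, each task's gradient acts as a regularizer on the representation used by the other — the intuition behind multi-task approaches such as the trace-norm regularized deep MTL the paper compares against.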
