Collaborative filtering with facial expressions for online video recommendation

We propose a procedure for online video recommendation based on users' facial expressions. The proposed procedure addresses a user's preference changes, which are often observed while the user watches a video, as well as the new user problem. Experimental results show that the proposed procedure achieves better prediction accuracy than benchmark systems.

Online video recommender systems help users find videos suited to their preferences, but they have difficulty identifying dynamic user preferences. In this study, we propose a new recommendation procedure that uses moment-by-moment changes in users' facial expressions. Facial expressions portray users' actual emotions about videos, so we can use them to discover dynamic user preferences. Further, because the proposed procedure does not rely on historical rating or purchase records, it properly addresses the new user problem, that is, the difficulty of recommending products to users whose past rating or purchase records are unavailable. To validate the procedure, we conducted experiments with footwear commercial videos. The results show that the proposed procedure outperforms benchmark systems, including random recommendation, an average-rating approach, and a typical collaborative filtering approach, for both new and existing users. From these results, we conclude that facial expressions are a viable element in recommendation.
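
To illustrate the general idea of substituting expression profiles for rating histories, the following is a minimal, hypothetical Python sketch, not the authors' implementation. It assumes each user can be summarized by a fixed-length facial-expression feature vector (for example, averaged action-unit intensities recorded while watching videos) and applies plain user-user collaborative filtering with cosine similarity; all names, shapes, and data are illustrative assumptions.

```python
# Illustrative sketch (not the authors' implementation): user-user collaborative
# filtering in which each user's profile is a facial-expression feature vector
# rather than a historical rating vector. All shapes and data are assumptions.
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two 1-D feature vectors."""
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b) / denom if denom else 0.0

def predict_ratings(target_profile, profiles, ratings, k=5):
    """Predict the target user's rating for each video as a similarity-weighted
    average of the k most similar existing users' ratings.

    target_profile : (d,) facial-expression feature vector of the target user
    profiles       : (n_users, d) feature vectors of existing users
    ratings        : (n_users, n_videos) known ratings of existing users
    """
    sims = np.array([cosine_similarity(target_profile, p) for p in profiles])
    top = np.argsort(sims)[::-1][:k]          # indices of the k nearest users
    weights = sims[top]
    if weights.sum() == 0:
        return ratings.mean(axis=0)           # fall back to the average rating
    return weights @ ratings[top] / weights.sum()

# Example: recommend the highest-predicted video for a new user whose only
# available signal is the facial-expression profile captured so far.
rng = np.random.default_rng(0)
profiles = rng.random((20, 8))                          # 20 users, 8 features
ratings = rng.integers(1, 6, (20, 10)).astype(float)    # ratings on 10 videos
new_user = rng.random(8)
scores = predict_ratings(new_user, profiles, ratings)
print("recommended video:", int(np.argmax(scores)))
```

Because the new user's neighbours are found from the expression profile rather than from past ratings, a prediction can be made even when no rating history exists, which is the essence of how such a procedure sidesteps the new user problem.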
