SmileCityReport: Emotion-aware Participatory Sensing for Smart Cities with Double-sided Photo Shooting

Collecting information about events taking place in local neighborhoods, along with the emotional status of the people involved, can enable an “affective smart city map” with which, for example, local authorities can review the measures adopted for their areas and assess whether these measures have actually contributed to residents' quality of life (QoL) and well-being. To realize such an information system with easy deployability, real-time operation, and secure protection of users' sensitive data, we propose SmileCityReport, a smartphone-app-based participatory sensing system that captures both city events and the reporter's emotion-related status using a novel technique that operates two cameras simultaneously. In a one-week evaluation with 15 users, we confirmed that the proposed methodology leads to greater user activity and (estimated) more positive emotional status, and that the captured emotion-related facial expression values constitute valuable data that can be publicly shared.
