An analysis of smartphone overuse recognition in terms of emotions using brainwaves and deep learning

Abstract: The overuse of smartphones is increasingly becoming a social problem. In this paper, we analyze smartphone overuse levels, according to emotion, using brainwave (EEG) measurements and deep learning. We assessed the asymmetry power with respect to theta, alpha, beta, gamma, and total brainwave activity in 11 lobes. A deep belief network (DBN) was used as the deep learning method, together with k-nearest neighbors (kNN) and a support vector machine (SVM), to classify the smartphone addiction level. The risk group (13 subjects) and non-risk group (12 subjects) watched videos designed to elicit the following emotions: relaxation, fear, joy, and sadness. We found that the risk group was more emotionally unstable than the non-risk group. In particular, fear recognition showed a clear difference between the risk and non-risk groups, and the gamma band exhibited the most pronounced difference between the two groups. Moreover, we demonstrated that activity measured in the frontal, parietal, and temporal lobes served as an indicator for emotion recognition. Using the DBN, we confirmed that emotion recognition was more accurate for the non-risk group than for the risk group. The risk group achieved higher accuracy for low valence and arousal, whereas the non-risk group achieved higher accuracy for high valence and arousal.
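
The sketch below illustrates, in rough form, the kind of pipeline the abstract describes: band-power asymmetry features computed from left/right electrode pairs, followed by the kNN and SVM baselines. It is not the authors' code; the electrode pairs, band edges, sampling rate, and helper names are illustrative assumptions, and the DBN stage is omitted.

```python
# Minimal sketch (assumptions, not the paper's implementation) of band-power
# asymmetry features and the kNN/SVM baselines mentioned in the abstract.
import numpy as np
from scipy.signal import welch
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

FS = 128  # assumed EEG sampling rate in Hz
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}
# Hypothetical left/right electrode pairs (10-20 system) used for asymmetry.
PAIRS = [("F3", "F4"), ("P3", "P4"), ("T7", "T8")]

def band_power(signal, fs, lo, hi):
    """Average Welch PSD of a single channel within [lo, hi) Hz."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    mask = (freqs >= lo) & (freqs < hi)
    return psd[mask].mean()

def asymmetry_features(epoch, channel_index, fs=FS):
    """Log-power asymmetry ln(right) - ln(left) per band and electrode pair.

    `epoch` is a (channels, samples) array; `channel_index` maps electrode
    names to row indices. Returns a 1-D feature vector for one epoch.
    """
    feats = []
    for left, right in PAIRS:
        for lo, hi in BANDS.values():
            p_left = band_power(epoch[channel_index[left]], fs, lo, hi)
            p_right = band_power(epoch[channel_index[right]], fs, lo, hi)
            feats.append(np.log(p_right) - np.log(p_left))
    return np.array(feats)

def compare_classifiers(X, y):
    """Compare kNN and SVM on features X (n_epochs x n_features) with
    emotion labels y (e.g. 0=relaxed, 1=fear, 2=joy, 3=sadness)."""
    for name, clf in [("kNN", KNeighborsClassifier(n_neighbors=5)),
                      ("SVM", SVC(kernel="rbf", C=1.0))]:
        acc = cross_val_score(clf, X, y, cv=5).mean()
        print(f"{name}: mean 5-fold accuracy = {acc:.3f}")
```

In practice the same feature matrix could be fed to a DBN (stacked RBMs with a softmax output) for the deep-learning comparison reported in the paper; that stage is left out here to keep the sketch self-contained.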
