Event Cognition-based Daily Activity Prediction Using Wearable Sensors

Learning from human behaviors in the real world is essential for human-aware intelligent systems such as smart assistants and autonomous robots. Most research focuses on correlations between sensory patterns and a label for each activity. However, a human activity is a combination of several event contexts and is a narrative story in and of itself. We propose a novel approach to human activity prediction based on event cognition. Egocentric multi-sensor data are collected from an individual's daily life using a wearable device and a smartphone. Event contexts covering location, scene, and activity are then recognized, and finally the user's daily activities are predicted by a decision rule over these event contexts. The proposed method was evaluated on wearable sensor data collected in the real world over two weeks from two people. Experimental results showed improved recognition accuracy with the proposed method compared to using sensory features directly.
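The abstract describes a two-stage pipeline: recognize event contexts (location, scene, activity) from the sensor streams, then apply a decision rule over those contexts to predict the daily activity. The sketch below illustrates that structure only; the `EventContext` fields, the context values, and the rule itself are illustrative assumptions, since the paper's actual recognizers and decision rule are not given here.

```python
# Minimal sketch of the two-stage idea: event contexts in, daily activity out.
# All field names, context values, and rules are hypothetical placeholders.
from dataclasses import dataclass


@dataclass
class EventContext:
    location: str  # e.g., from GPS/Wi-Fi clustering on the smartphone
    scene: str     # e.g., from a CNN over egocentric camera images
    activity: str  # e.g., from accelerometer-based activity recognition


def predict_daily_activity(ctx: EventContext) -> str:
    """Hypothetical decision rule mapping recognized event contexts
    to a daily-activity label."""
    if ctx.location == "kitchen" and ctx.activity == "standing":
        return "cooking"
    if ctx.scene == "office" and ctx.activity == "sitting":
        return "working"
    if ctx.location == "outdoors" and ctx.activity == "walking":
        return "commuting"
    return "other"


if __name__ == "__main__":
    # One recognized event context for a single time window
    ctx = EventContext(location="kitchen", scene="indoor", activity="standing")
    print(predict_daily_activity(ctx))  # -> "cooking"
```

The point of the sketch is that the final prediction consumes symbolic event contexts rather than raw sensory features, which is what the abstract credits for the accuracy improvement.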
