Fusing On-Body Sensing with Local and Temporal Cues for Daily Activity Recognition

Automatically recognizing people's daily activities is essential for a variety of applications, such as just-in-time content delivery and quantified self-tracking. To this end, researchers often use customized wearable motion sensors tailored to recognize a small set of handpicked activities in controlled environments. In this paper, we design and engineer a scalable daily activity recognition framework by leveraging two widely adopted commercial devices: an Android smartphone and a Pebble smartwatch. Deploying our system outside the laboratory, we collected more than 72 days of data in total from 12 user-study participants. We systematically show the usefulness of time, location, and wrist-based motion for automatically recognizing 10 standardized activities, as specified by the American Time Use Survey taxonomy. Overall, we achieve a recognition accuracy of 76.28% for personalized models and 69.80% for generic, interpersonal models.
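The fusion described above can be illustrated with a minimal sketch: concatenate temporal, location, and wrist-motion features into one vector and train a Random Forest (the abstract's setting; the paper's reference list cites both Scikit-learn and Random Forests). All feature names, the synthetic data, and the label construction below are illustrative assumptions, not the authors' actual pipeline.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 600
hour = rng.integers(0, 24, n)        # temporal cue (hour of day)
location = rng.integers(0, 5, n)     # discretized location cluster (hypothetical)
motion = rng.normal(size=(n, 3))     # wrist accelerometer summary statistics

# Fuse the three cue types into a single feature matrix
X = np.column_stack([hour, location, motion])
# Synthetic labels loosely tied to time and place, standing in for ATUS activities
y = (hour // 8) * 5 + location

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
acc = clf.score(X_te, y_te)
```

A personalized model corresponds to fitting this classifier on one participant's data, while a generic, interpersonal model is fit on data pooled across participants and evaluated on a held-out user.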
