Using Mobile Sensing on Smartphones for the Management of Daily Life Tasks

Today's smartphones contain a variety of embedded sensors capable of monitoring and measuring physically relevant quantities such as light and noise intensity, rotation and acceleration, magnetic field strength, and humidity. Combining the data from these different sensors to derive new, practically useful information, an approach known as sensor fusion or multimodal sensing, extends the capabilities of the individual sensors. The authors hypothesize that the sensing technology embedded in smartphones can also support the management of daily life tasks. Because one of the biggest challenges in mobile sensing on smartphones is the lack of appropriate unified data analysis models and common software toolkits, the authors developed a prototype mobile sensing architecture called Sensing Things Done (STD). Using this prototype to apply multimodal sensing and gather sensor data while a specific set of tasks was performed, the authors conducted a feasibility study to investigate the above hypothesis. This chapter describes the study, examines to what extent task-related activities can be detected automatically using the sensors of a standard smartphone, and provides recommendations derived from the results.
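As a concrete illustration of the kind of multimodal sensing the chapter describes, the minimal Kotlin sketch below fuses accelerometer and magnetometer readings on Android to estimate device orientation, a quantity neither sensor can provide on its own. It uses only the standard Android SensorManager API; the class and field names are illustrative assumptions, not part of the STD prototype.

```kotlin
import android.content.Context
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager

// Illustrative sketch (not the STD prototype): fuse two sensor
// modalities -- accelerometer (gravity direction) and magnetometer
// (north direction) -- into a device orientation estimate.
class OrientationFusion(context: Context) : SensorEventListener {
    private val sensorManager =
        context.getSystemService(Context.SENSOR_SERVICE) as SensorManager
    private val accel = FloatArray(3)
    private val magnet = FloatArray(3)
    private val rotationMatrix = FloatArray(9)
    val orientation = FloatArray(3) // azimuth, pitch, roll in radians

    fun start() {
        sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER)?.let {
            sensorManager.registerListener(this, it, SensorManager.SENSOR_DELAY_UI)
        }
        sensorManager.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD)?.let {
            sensorManager.registerListener(this, it, SensorManager.SENSOR_DELAY_UI)
        }
    }

    fun stop() = sensorManager.unregisterListener(this)

    override fun onSensorChanged(event: SensorEvent) {
        // Keep the latest reading from each modality.
        when (event.sensor.type) {
            Sensor.TYPE_ACCELEROMETER -> event.values.copyInto(accel)
            Sensor.TYPE_MAGNETIC_FIELD -> event.values.copyInto(magnet)
        }
        // The fusion step: together the two readings determine the
        // rotation matrix, from which the orientation angles follow.
        if (SensorManager.getRotationMatrix(rotationMatrix, null, accel, magnet)) {
            SensorManager.getOrientation(rotationMatrix, orientation)
        }
    }

    override fun onAccuracyChanged(sensor: Sensor, accuracy: Int) { /* unused */ }
}
```

The same pattern generalizes: each additional modality (barometer, microphone, GPS) contributes a partial view, and the fused result carries information no single sensor stream contains.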
