Mobile Apps: It's Time to Move Up to CondOS

Sensing is a significant contributor to the current mobile computing revolution. Today's typical smartphone has more than eight sensors, including multiple microphones, cameras, accelerometers, gyroscopes, a GPS receiver, a digital compass, and proximity sensors. These sensors not only provide natural user interaction with the device, but also offer tantalizing opportunities for context-aware computing. A rich history of work has investigated algorithms for converting raw sensor data into context, as well as specific uses for that context [3]. To cite just two examples: a restaurant-finder app may adjust its search radius depending on whether the user is on foot, cycling, or driving, which can be inferred from GPS and IMU readings [11]. A Twitter app might choose to alert the user to her latest updates at an interruptible moment, such as when she is not engaged in conversation, which can be inferred from the microphone's audio [5].

The ingredients for context appear ready: the sensing hardware, the data-processing algorithms, and the application scenarios are all primed. The question that emerges is: who is responsible for context generation?

One option is for apps to manage their own context generation. This approach appears appealing because apps are most familiar with their own context needs. However, many mobile OSes, such as the iPhone's iOS and Windows Phone 7, harbor legitimate energy concerns and severely restrict non-foreground processing. As a result, an app may generate context from immediately available sensor data, but it is unable to maintain context while outside the scope of its execution. The failure can be as simple as missing the accelerometer's transition from sitting to standing, since sensing either state outside the transition period does not yield distinguishing information. Alternatively, Android apps may run in the background, but then the user is at the mercy of the flawless app developer to use resources intelligently. Another option is to simply ship all sensor data to the cloud for context generation.
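To make the restaurant-finder example concrete, here is a minimal Python sketch of the kind of context inference described above. The window format, threshold values, and function names are illustrative assumptions, not from the paper; a real system would train a classifier on labeled sensor traces (cf. [3]) rather than hand-pick thresholds.

```python
import math

# Illustrative thresholds (assumptions, not trained values).
WALK_VAR_THRESHOLD = 1.5     # variance of accel magnitude, (m/s^2)^2
DRIVE_SPEED_THRESHOLD = 6.0  # GPS speed in m/s (~21.6 km/h)

def accel_variance(samples):
    """Variance of accelerometer magnitude over a window of (x, y, z) samples."""
    mags = [math.sqrt(x * x + y * y + z * z) for (x, y, z) in samples]
    mean = sum(mags) / len(mags)
    return sum((m - mean) ** 2 for m in mags) / len(mags)

def infer_mode(accel_window, gps_speed):
    """Classify a coarse transportation mode from one window of sensor data."""
    if gps_speed > DRIVE_SPEED_THRESHOLD:
        return "driving"
    if accel_variance(accel_window) > WALK_VAR_THRESHOLD:
        return "walking"
    return "stationary"

def search_radius_m(mode):
    """Map the inferred mode to a restaurant-search radius in meters."""
    return {"stationary": 500, "walking": 1000, "driving": 5000}[mode]

# Usage: a still phone (near-constant 1 g) with low GPS speed.
window = [(0.1, 0.2, 9.8)] * 50
mode = infer_mode(window, gps_speed=0.5)
print(mode, search_radius_m(mode))  # stationary 500
```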

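The sit-to-stand example deserves unpacking: why does foreground-only sensing lose this context? The sketch below, under an assumed signal model with a hypothetical burst threshold, shows that the two postures read nearly identically (about 1 g of static vertical acceleration), so isolated snapshots carry no distinguishing information; only a continuous stream that spans the brief transition burst does.

```python
G = 9.8               # gravity, m/s^2
BURST_THRESHOLD = 2.0 # deviation from 1 g marking a transition; assumption

def detect_transitions(stream):
    """Return indices where vertical acceleration deviates from 1 g enough
    to indicate a posture transition (e.g., standing up)."""
    return [i for i, a in enumerate(stream) if abs(a - G) > BURST_THRESHOLD]

# Synthetic trace: sitting (~1 g), a brief stand-up burst, then standing (~1 g).
trace = [G] * 50 + [G + 3.5, G + 4.2, G - 2.8] + [G] * 50

print(detect_transitions(trace))       # continuous sensing: [50, 51, 52]
print(detect_transitions(trace[:20]))  # snapshot before the event: []
print(detect_transitions(trace[-20:])) # snapshot after the event: []
```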
References

[1] E. A. Lee et al. Synchronous data flow. Proceedings of the IEEE, 1987.
[2] E. Kohler et al. The Click modular router. ACM Transactions on Computer Systems, 2000.
[3] M. L. Littman et al. Activity recognition from accelerometer data. AAAI, 2005.
[4] C. G. Atkeson et al. Predicting human interruptibility with sensors. ACM Transactions on Computer-Human Interaction, 2005.
[5] C. M. Bishop. Pattern Recognition and Machine Learning (Information Science and Statistics). 2006.
[6] J. A. Paradiso et al. Gait analysis using a shoe-integrated wireless sensor system. IEEE Transactions on Information Technology in Biomedicine, 2008.
[7] L. Van Gool et al. Speeded-Up Robust Features (SURF). Computer Vision and Image Understanding, 2008.
[8] A. Wolman et al. I am a sensor, and I approve this message. HotMobile '10, 2010.
[9] F. Zhao et al. Energy-accuracy trade-off for continuous mobile device location. MobiSys '10, 2010.
[10] B.-G. Chun et al. TaintDroid: an information-flow tracking system for realtime privacy monitoring on smartphones. OSDI, 2010.
[11] J. Howell et al. What You See is What They Get: protecting users from unwanted use of microphones, cameras, and other sensors. 2010.
[12] D. Lymberopoulos et al. EERS: Energy Efficient Responsive Sleeping on mobile phones. 2010.
[13] J. Liu et al. Enabling energy efficient continuous sensing on mobile phones with LittleRock. IPSN '10, 2010.
[14] H. Balakrishnan et al. "Extra-sensory perception" for wireless networks. HotNets-IX, 2010.
[15] J. Liu et al. LittleRock: enabling energy-efficient continuous sensing on mobile phones. IEEE Pervasive Computing, 2011.