Below the Surface: Unobtrusive Activity Recognition for Work Surfaces Using RF-Radar Sensing

Activity recognition is a core component of many intelligent and context-aware systems. In this paper, we present a solution for discreetly and unobtrusively recognizing common work activities above a work surface without using cameras. We demonstrate our approach, which utilizes an RF-radar sensor mounted under the work surface, in two work domains: recognizing work activities at a convenience-store counter (useful for post-hoc analytics) and recognizing common office deskwork activities (useful for real-time applications). We classify seven clerk activities with 94.9% accuracy using data collected in a lab environment, and recognize six common deskwork activities collected in real offices with 95.3% accuracy. We show that using multiple projections of the RF signal leads to improved recognition accuracy. Finally, we show how smartwatches worn by users can be used to attribute an activity, recognized with the RF sensor, to a particular user in multi-user scenarios. We believe our solution can mitigate some of the privacy concerns users associate with cameras and is useful for a wide range of intelligent systems.
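To make the multiple-projections idea concrete, below is a minimal sketch of the kind of pipeline the abstract describes: per-projection features are extracted from the radar signal, concatenated, and fed to a standard classifier. This is not the authors' implementation; the projection names (range profile, Doppler spectrum, angle profile), the summary-statistic features, and the Random Forest classifier are all illustrative assumptions, since the abstract states only that combining projections improves accuracy.

```python
# Hedged sketch: classifying work activities from multiple projections of an
# RF-radar signal. Projection names, features, and classifier are assumptions,
# not the paper's actual method.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score


def featurize(projection: np.ndarray) -> np.ndarray:
    """Summary statistics over one projection of a radar frame; a stand-in
    for whatever per-projection features the real system extracts."""
    return np.array([projection.mean(), projection.std(),
                     projection.max(), projection.min()])


def combine_projections(range_profile, doppler_spectrum, angle_profile):
    """Concatenate per-projection features into a single vector, mirroring
    the finding that multiple projections outperform any single one."""
    return np.concatenate([featurize(p) for p in
                           (range_profile, doppler_spectrum, angle_profile)])


# Toy data: 200 windows, each with 3 hypothetical projections of 64 samples,
# labeled with one of 6 activities (as in the deskwork study).
rng = np.random.default_rng(0)
X = np.stack([combine_projections(*rng.normal(size=(3, 64)))
              for _ in range(200)])
y = rng.integers(0, 6, size=200)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
print(cross_val_score(clf, X, y, cv=5).mean())  # chance-level on toy data
```

On real radar windows, dropping any one projection from `combine_projections` and re-running the cross-validation would reproduce the kind of single- versus multi-projection comparison the abstract reports.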
