Unobtrusive Activity Recognition and Position Estimation for Work Surfaces Using RF-Radar Sensing

Activity recognition is a core component of many intelligent, context-aware systems. We present a solution for discreetly and unobtrusively recognizing common work activities above a work surface without using cameras. We demonstrate our approach, which mounts an RF-radar sensor under the work surface, in three domains: recognizing work activities at a convenience-store counter, recognizing common office deskwork activities, and estimating the position of customers in a showroom environment. These examples illustrate potential benefits both for post-hoc business analytics and for real-time applications. Our solution classified seven clerk activities with 94.9% accuracy on data collected in a lab environment and recognized six common deskwork activities collected in real offices with 95.3% accuracy. Using two sensors simultaneously, we demonstrate coarse position estimation around a large surface with 95.4% accuracy. We show that using multiple projections of the RF signal improves recognition accuracy. Finally, we show how smartwatches worn by users can attribute an activity, recognized with the RF sensor, to a particular user in multi-user scenarios. We believe our solution can mitigate some of the privacy concerns users associate with cameras and is useful for a wide range of intelligent systems.
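The abstract does not detail the classification pipeline, but the idea of combining multiple projections of the RF signal can be sketched as follows: extract a feature vector from each projection (e.g., a range profile and a Doppler profile), concatenate them, and train a standard classifier. This is a minimal illustrative sketch, not the authors' implementation; the feature shapes, sample counts, and use of a random forest are all assumptions, and the data here is synthetic noise standing in for real radar frames.

```python
# Hypothetical sketch: classifying work activities from concatenated
# RF-radar signal projections. All shapes and data are illustrative
# assumptions, not the paper's actual pipeline.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

n_samples = 140      # assumed number of labeled radar frames
range_bins = 32      # assumed range-profile length
doppler_bins = 16    # assumed Doppler-profile length
n_classes = 7        # seven clerk activities, as in the paper

# Synthetic stand-ins for two per-frame projections of the RF signal.
range_profile = rng.normal(size=(n_samples, range_bins))
doppler_profile = rng.normal(size=(n_samples, doppler_bins))

# Balanced synthetic labels: 20 frames per activity class, shuffled.
labels = rng.permutation(np.repeat(np.arange(n_classes), n_samples // n_classes))

# Concatenating multiple projections gives the classifier a richer
# feature vector than any single projection alone -- the intuition
# behind the paper's finding that multiple projections improve accuracy.
features = np.hstack([range_profile, doppler_profile])

clf = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, features, labels, cv=5)
print(features.shape, scores.shape)
```

With real radar data, the single-projection and concatenated feature sets could be cross-validated side by side to quantify the accuracy gain from fusing projections.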
