AtLAS: An Activity-Based Indoor Localization and Semantic Labeling Mechanism for Residences

Indoor localization technology and indoor location-based services are becoming increasingly important in mobile and ubiquitous computing. However, the design of an indoor location-based system faces two challenges: 1) achieving high-precision location recognition and 2) identifying what indoor objects actually are (semantic labeling). In this article, we propose AtLAS, an activity-based indoor localization and semantic labeling mechanism. The key idea is that some objects in an indoor environment, such as doors and toilets, induce predictable human behaviors within small areas, and these behaviors are reflected in distinctive sensor readings. AtLAS leverages this idea to determine a user’s location accurately by recognizing the user’s activities. Furthermore, AtLAS exploits the topological structure of indoor objects to mine semantic knowledge and uses that knowledge to label the objects automatically. To the best of our knowledge, AtLAS is the first system that leverages users’ activities to perform high-precision indoor localization and semantic labeling in residences. The experimental results show that AtLAS achieves a median localization accuracy of 0.57 m and localizes landmarks with a median accuracy of 0.43 m on average when the worst 5% of errors are excluded. AtLAS labels objects semantically with an average false-positive rate of 5.7% and an average false-negative rate of 5.8%.
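
To make the key idea concrete, the following Python sketch (using NumPy) illustrates how recognizing an object-related activity can anchor a drifting, dead-reckoned position to a known landmark. The landmark map, activity labels, thresholds, and the detect_activity heuristic are hypothetical placeholders chosen for illustration; the abstract does not disclose the actual activity classifier or landmark database that AtLAS uses.

    import numpy as np

    # Hypothetical landmark map: activity label -> known indoor coordinates (metres).
    # In AtLAS these positions would come from the semantic-labeling stage.
    LANDMARKS = {
        "open_door": (0.0, 0.0),
        "flush_toilet": (3.2, 1.5),
    }

    def detect_activity(accel_window, gyro_window):
        """Toy activity detector over one window of smartphone IMU samples.

        The features and thresholds are illustrative placeholders, not the
        recognition model described in the paper.
        """
        accel_energy = np.var(np.linalg.norm(accel_window, axis=1))
        gyro_energy = np.var(np.linalg.norm(gyro_window, axis=1))
        if gyro_energy > 0.5 and accel_energy < 0.2:
            return "open_door"      # strong rotation, little translation
        if accel_energy > 1.0:
            return "flush_toilet"   # short, sharp arm movement
        return None

    def update_position(position, step_vector, activity):
        """Dead-reckon one step, then snap to a landmark if an activity fired."""
        position = np.asarray(position, dtype=float) + step_vector
        if activity in LANDMARKS:
            # Resetting to the landmark removes the accumulated drift.
            position = np.asarray(LANDMARKS[activity], dtype=float)
        return position

    # Usage: a few dead-reckoned strides, then a door-opening gesture.
    rng = np.random.default_rng(0)
    pos = np.array([5.0, 5.0])
    for _ in range(3):
        pos = update_position(pos, np.array([0.6, 0.0]), activity=None)
    accel = rng.normal(0.0, 0.1, size=(50, 3))   # quiet accelerometer window
    gyro = rng.normal(0.0, 2.0, size=(50, 3))    # strong rotation window
    pos = update_position(pos, np.array([0.6, 0.0]), detect_activity(accel, gyro))
    print(pos)  # snapped to the "open_door" landmark at (0.0, 0.0)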
