Towards Urban Environment Familiarity Prediction

Location Based Services (LBS) are helpful for people interacting with an unfamiliar environment, but also for those who already possess a certain level of familiarity with it. To avoid overwhelming familiar users with unnecessary information, the level of detail offered by the LBS should be adapted to the user's familiarity with the environment: more details for unfamiliar users, and less information for familiar users, for whom additional detail would be superfluous or even misleading. Currently, the information exchange between the service and its users does not take familiarity into account. In this work, we investigate the potential of machine learning for a binary classification of familiarity (i.e., familiar vs. unfamiliar) with the surrounding environment. For this purpose, a 3D virtual environment based on a part of Vienna, Austria was designed using datasets from the municipal government. During a navigation experiment with 22 participants we collected ground truth data in order to train four machine learning algorithms. The captured data included the motion and orientation of the users as well as their visual interaction with the surrounding buildings during navigation. This work demonstrates the potential of machine learning for predicting the state of familiarity as an enabling step towards LBS better tailored to the user.
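As a concrete illustration of the classification task described above, the following is a minimal sketch of a binary familiarity classifier. It assumes a Random Forest (one of several candidate algorithms) trained on hypothetical per-segment features such as walking speed, head-orientation variance, and gaze dwell time on buildings; neither this feature set nor the placeholder data reflects the study's actual pipeline.

```python
# Minimal sketch (not the authors' actual pipeline): binary familiarity
# classification with a Random Forest. The feature names below
# (walking speed, head-orientation variance, gaze dwell time on buildings)
# are illustrative assumptions, not the study's published feature set.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Placeholder data: one row per navigation segment,
# columns = [walking speed, head-orientation variance, gaze dwell time].
X = rng.normal(size=(220, 3))
# Ground-truth labels: 1 = familiar, 0 = unfamiliar.
y = rng.integers(0, 2, size=220)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=5, scoring="accuracy")
print(f"Cross-validated accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
```

On real data, the same setup would report how well the recorded motion, orientation, and gaze features separate familiar from unfamiliar users.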
