Using Location, Bearing and Motion Data to Filter Video and System Logs

In evaluating and analysing a pervasive computing system, it is common to log system use and to create video recordings of users. Large volumes of data are often generated, representing potentially long periods of user activity. We present a procedure to identify sections of such data that are salient given the current context of analysis; for example, analysing the activity of a particular person among many trial participants recorded by multiple cameras. By augmenting the cameras used to capture a mobile experiment, we are able to establish both a location and a heading for each camera, and thus model each camera's field of view over time. Locations of trial participants are also recorded and compared against camera views, to determine which periods of user activity are likely to have been recorded in detail. Additionally, the stability of each camera can be tracked, so that video can subsequently be filtered to exclude footage of unacceptable quality. These techniques are implemented in an extension to Replayer, a software toolkit for use in the development cycle of mobile applications. A report of initial testing is given, in which the technique's use is demonstrated on a representative mobile application.
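The core filtering step described above — deciding whether a participant's logged position falls inside a camera's modelled field of view at a given moment — can be sketched as a simple planar cone test. The following is an illustrative sketch only, not the authors' implementation; the function name, parameters, and the assumption of a flat 2D coordinate frame with compass-style headings are all our own:

```python
import math

def in_field_of_view(cam_xy, heading_deg, fov_deg, max_range, target_xy):
    """Return True if target_xy lies within the camera's modelled view cone.

    cam_xy, target_xy: (x, y) positions in a shared planar frame (assumed).
    heading_deg: compass-style heading of the camera's optical axis
                 (0 degrees = +y axis, increasing clockwise).
    fov_deg: total horizontal field-of-view angle of the camera.
    max_range: distance beyond which a subject is assumed to be recorded
               in insufficient detail (a hypothetical cutoff).
    """
    dx = target_xy[0] - cam_xy[0]
    dy = target_xy[1] - cam_xy[1]
    dist = math.hypot(dx, dy)
    if dist == 0:
        return True          # target coincides with the camera position
    if dist > max_range:
        return False         # too far away to be usefully recorded
    # Bearing from camera to target, compass-style (0 deg = +y, clockwise).
    bearing = math.degrees(math.atan2(dx, dy))
    # Smallest signed angular difference between bearing and heading.
    diff = (bearing - heading_deg + 180) % 360 - 180
    return abs(diff) <= fov_deg / 2
```

Evaluating this test per logged timestamp, per camera, yields the periods in which a given participant was plausibly on camera; the stability filter the abstract mentions would then discard frames where camera motion exceeds some threshold.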
