Remotion: A Motion-Based Capture and Replay Platform of Mobile Device Interaction for Remote Usability Testing

Remotion is an end-to-end system for capturing and replaying rich mobile device interactions, comprising both on-screen video and physical device motion. The blueprints and software provided here allow an interface to be instrumented with Remotion's capture and visualization system. Remotion mimics mobile device motion through a software 3D visualization and a robotic mount that replicates the movements of a remote user's device from afar. Deployed together, these components let experimenters observe the device postures of a remote user as if that user were in the room. This is important because many usability studies are carried out remotely, and the contribution and scale of those studies are irreplaceable. We compared how HCI experts ("analysts") observed remote users' behavioral data across three replay platforms: a traditional live time-series plot of motion, Remotion's software visualization, and Remotion's hardware visualization. We found that Remotion helps analysts infer a user's attention, emotional state, habits, and active hand posture, and that it reduces analysts' mental demand when analyzing the remote user's contextual information.
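
The abstract describes, but does not show, how captured motion is replayed. As a rough illustration only (an assumption, not Remotion's actual code), the sketch below replays a hypothetical log of timestamped device-orientation quaternions at their recorded cadence and converts each sample to Euler angles, the kind of signal that could drive a software 3D visualization or a servo-actuated robotic mount.

```python
import json
import math
import time


def quaternion_to_euler(w, x, y, z):
    """Convert a unit quaternion to roll, pitch, yaw in degrees."""
    roll = math.atan2(2 * (w * x + y * z), 1 - 2 * (x * x + y * y))
    pitch = math.asin(max(-1.0, min(1.0, 2 * (w * y - z * x))))
    yaw = math.atan2(2 * (w * z + x * y), 1 - 2 * (y * y + z * z))
    return tuple(math.degrees(a) for a in (roll, pitch, yaw))


def replay(log_path):
    """Replay logged orientation samples at their original timing.

    Each line of the (hypothetical) log file is JSON of the form:
      {"t": seconds_since_start, "q": [w, x, y, z]}
    """
    with open(log_path) as f:
        samples = [json.loads(line) for line in f]

    start = time.monotonic()
    for sample in samples:
        # Wait until this sample's original timestamp before emitting it.
        delay = sample["t"] - (time.monotonic() - start)
        if delay > 0:
            time.sleep(delay)
        roll, pitch, yaw = quaternion_to_euler(*sample["q"])
        # In a full system these angles would update the 3D view or the mount's servos;
        # here we simply print them.
        print(f"t={sample['t']:.3f}s roll={roll:.1f} pitch={pitch:.1f} yaw={yaw:.1f}")


if __name__ == "__main__":
    replay("motion_log.jsonl")  # hypothetical log file name
```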
