Astral: Prototyping Mobile and Smart Object Interactive Behaviours Using Familiar Applications

Astral is a prototyping tool for authoring mobile and smart object interactive behaviours. It mirrors selected display contents of desktop applications onto mobile devices (smartphones and smartwatches), and streams and remaps mobile sensor data to desktop input events (mouse or keyboard) that manipulate the selected contents. By combining display mirroring, sensor streaming and input remapping, Astral lets designers exploit familiar desktop applications (e.g. PowerPoint, After Effects) to prototype, explore and fine-tune rich, dynamic interactive behaviours. With Astral, designers can visually author rules to test real-time behaviours both while interactions take place and after they have occurred. We demonstrate Astral's applicability, workflow and expressiveness within the interaction design process through new examples and replications of prior approaches, illustrating how various familiar desktop applications can be leveraged and repurposed.
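The sensor-to-input remapping described above can be sketched minimally: a mobile sensor sample (here, accelerometer tilt) is converted into an ordinary mouse delta, so an unmodified desktop application simply receives pointer input. All names, the linear mapping and the clamping rule are illustrative assumptions, not Astral's actual API.

```python
# Hypothetical sketch of Astral-style input remapping (not Astral's API):
# map accelerometer tilt (degrees) to a clamped desktop mouse delta.

def remap_tilt_to_mouse(tilt_x: float, tilt_y: float,
                        gain: float = 10.0, limit: int = 25) -> tuple[int, int]:
    """Convert tilt in degrees to a clamped (dx, dy) mouse delta in pixels."""
    def clamp(v: float) -> int:
        # Saturate the delta so extreme tilts don't fling the cursor.
        return max(-limit, min(limit, round(v)))
    return clamp(tilt_x * gain), clamp(tilt_y * gain)

# A gentle tilt nudges the cursor; a strong tilt saturates at the limit.
print(remap_tilt_to_mouse(1.5, -0.5))   # -> (15, -5)
print(remap_tilt_to_mouse(40.0, 0.0))   # clamped -> (25, 0)
```

A designer would tune `gain` and `limit` interactively, which is the kind of fine-tuning of dynamic behaviour the tool is built to support.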
