Astral: Prototyping Mobile and IoT Interactive Behaviours via Streaming and Input Remapping

We present Astral, a prototyping tool for mobile and Internet of Things (IoT) interactive behaviours that streams selected desktop display contents onto mobile devices (smartphones and smartwatches) and remaps mobile sensor data into desktop input events (i.e., keyboard and mouse events). Interactive devices such as mobile phones, watches, and smart objects offer new opportunities for interaction design, yet prototyping their interactive behaviour remains an implementation challenge. Additionally, current tools often focus on systems that respond after an action takes place, as opposed to while the action takes place. With Astral, designers can rapidly author interactive prototypes live on mobile devices through familiar desktop applications. Designers can also customize input mappings using easing functions to author, fine-tune, and assess rich outputs. We demonstrate the expressiveness of Astral through a set of prototyping scenarios, with both novel examples and examples replicated from past literature, which reflect how the system might support and empower designers throughout the design process.
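To make the input-remapping idea concrete, the sketch below shows one way a sensor reading could be normalised, shaped by a Penner-style easing function, and rescaled into a desktop coordinate. This is a minimal illustration under assumed names and ranges, not Astral's actual API or implementation.

```python
import math

# Illustrative sketch of easing-based input remapping: a mobile sensor
# value (here, device tilt) is normalised to [0, 1], shaped by an easing
# function, and rescaled to a desktop output range (a mouse x coordinate).
# All names, signatures, and ranges are hypothetical, not Astral's API.

def ease_in_out_quad(t: float) -> float:
    """Penner-style quadratic ease-in-out over t in [0, 1]."""
    return 2 * t * t if t < 0.5 else 1 - (-2 * t + 2) ** 2 / 2

def remap(value: float, in_min: float, in_max: float,
          out_min: float, out_max: float, easing=ease_in_out_quad) -> float:
    """Clamp and normalise a sensor reading, apply easing, then rescale."""
    t = (value - in_min) / (in_max - in_min)
    t = max(0.0, min(1.0, t))
    return out_min + easing(t) * (out_max - out_min)

# Example: a tilt of -45..45 degrees drives the mouse x position 0..1920.
for tilt in (-45, -20, 0, 20, 45):
    x = remap(tilt, -45, 45, 0, 1920)
    print(f"tilt {tilt:+3d} deg -> mouse x {x:7.1f}")
```

Swapping in a different easing function (linear, ease-in, bounce, and so on) changes how the output accelerates over the input range, which is the kind of fine-tuning of rich outputs the abstract describes.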
