Wearables as Context for Guiard-abiding Bimanual Touch

We explore the contextual details afforded by wearable devices to support multi-user, direct-touch interaction on electronic whiteboards in a way that, unlike previous work, can be fully consistent with natural bimanual-asymmetric interaction as set forth by Guiard. Our work offers the following key observation. Although Guiard's framework has been widely applied in HCI, for bimanual interfaces where each hand interacts via direct touch, subtle limitations of multi-touch technologies, as well as limitations in conception and design, mean that the resulting interfaces often cannot fully adhere to Guiard's principles even when their designers intend them to. The interactions are fundamentally ambiguous because the system cannot tell which hand, left or right, contributes each touch. By integrating additional context from wearable devices, our system can identify which user is touching and distinguish which hand they use to do so. This enables our prototypes to respect lateral preference, the assignment of natural roles to each hand advocated by Guiard, in a way that has not been articulated before.
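To make the underlying idea concrete: a wrist-worn inertial measurement unit registers a brief acceleration spike when its wearer's hand strikes the display, so each touch-down event can be matched against recent spikes from every paired band to infer whose touch it was, and with which hand. The Python sketch below illustrates one minimal matching strategy; the sample schema, threshold, and time window are illustrative assumptions, not the paper's actual sensor-fusion pipeline.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class ImuSample:
    """One reading from a user's wrist-worn band (hypothetical schema)."""
    user_id: str        # which wearer this band is paired to
    hand: str           # "left" or "right", declared when the band is worn
    timestamp: float    # seconds, on a clock synchronized with the display
    accel_mag: float    # magnitude of linear acceleration, in g

# Illustrative constants; real values would be tuned empirically.
IMPACT_THRESHOLD_G = 1.5   # a touch-down produces a brief deceleration spike
MATCH_WINDOW_S = 0.05      # max spike-to-touch time gap considered a match

def attribute_touch(touch_time: float,
                    recent: List[ImuSample]) -> Optional[Tuple[str, str]]:
    """Return (user_id, hand) for the band whose impact spike falls
    closest in time to the touch-down event, or None if no band
    registered a plausible spike inside the matching window."""
    best: Optional[Tuple[str, str]] = None
    best_dt = MATCH_WINDOW_S
    for s in recent:
        dt = abs(s.timestamp - touch_time)
        if s.accel_mag >= IMPACT_THRESHOLD_G and dt <= best_dt:
            best, best_dt = (s.user_id, s.hand), dt
    return best
```

A spike-matching rule like this is only one plausible realization; in practice the correlation would also have to contend with clock skew between devices, near-simultaneous touches from different users, and hand motion that never reaches the display.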

[1] Mike Wu et al. Multi-finger and whole hand gestural interaction techniques for multi-user tabletop displays, 2003, UIST '03.

[2] Patrick Baudisch et al. Fiberio: a touchscreen that senses fingerprints, 2013, UIST.

[3] Olivier Bau et al. Arpège: learning multitouch chord gestures vocabularies, 2013, ITS.

[4] Terry Winograd et al. Fluid interaction with high-resolution wall-size displays, 2001, UIST '01.

[5] Tony DeRose et al. Toolglass and magic lenses: the see-through interface, 1993, SIGGRAPH.

[6] Carmelo Ardito et al. Interaction with Large Displays, 2015, ACM Comput. Surv.

[7] Ravin Balakrishnan et al. Symmetric bimanual interaction, 2000, CHI.

[8] Fabrice Matulic et al. Pen and touch gestural environment for document editing on interactive tabletops, 2013, ITS.

[9] Fabrice Matulic et al. Sensing techniques for tablet+stylus interaction, 2014, UIST.

[10] Takeo Igarashi et al. Flatland: new dimensions in office whiteboards, 1999, CHI '99.

[11] Hirotaka Osawa et al. iRing: intelligent ring using infrared reflection, 2012, UIST.

[12] Bill Buxton et al. Sketching User Experiences: Getting the Design Right and the Right Design, 2007.

[13] Dunja Mladenic et al. MTi: A method for user identification for multitouch displays, 2013, Int. J. Hum. Comput. Stud.

[14] Y. Guiard. Asymmetric division of labor in human skilled bimanual action: the kinematic chain as a model, 1987, Journal of Motor Behavior.

[15] Philip L. Davidson et al. Extending 2D object arrangement with pressure-sensitive layering cues, 2008, UIST '08.

[16] M. Sheelagh T. Carpendale et al. Gestures in the wild: studying multi-touch gesture sequences on interactive tabletop exhibits, 2011, CHI.

[17] Dennis Wixon et al. The Natural User Interface, 2011.

[18] William Buxton et al. Pen + touch = new tools, 2010, UIST.

[19] Thomas P. Moran et al. Tivoli: an electronic whiteboard for informal workgroup meetings, 1993, INTERCHI.

[20] Xiang 'Anthony' Chen et al. Duet: exploring joint interactions on a smart phone and a smart watch, 2014, CHI.

[21] Stéphane Chatty et al. Extending a graphical toolkit for two-handed interaction, 1994, UIST '94.

[22] Charles L. A. Clarke et al. symSpline: symmetric two-handed spline manipulation, 2006, CHI.

[23] Patrick Olivier et al. Expressy: Using a Wrist-worn Inertial Measurement Unit to Add Expressiveness to Touch-based Interactions, 2016, CHI.

[24] Andruid Kerne et al. High-performance pen + touch modality interactions: a real-time strategy game eSports context, 2012, UIST.

[25] Adam M. Fass et al. MessyDesk and MessyBoard: two designs inspired by the goal of improving human memory, 2002, DIS '02.

[26] Ken Hinckley et al. LightRing: always-available 2D input on any surface, 2014, UIST.

[27] William Buxton et al. The design of a GUI paradigm based on tablets, two-hands, and transparency, 1997, CHI.

[28] Daniel Vogel et al. Territoriality and behaviour on and around large vertical publicly-shared displays, 2012, DIS '12.

[29] Andreas Paepcke et al. Cooperative gestures: multi-user gestural interactions for co-located groupware, 2006, CHI.

[30] Chris Harrison et al. TapSense: enhancing finger interaction on touch surfaces, 2011, UIST.

[31] Daniel Vogel et al. Conté: multimodal input inspired by an artist's crayon, 2011, UIST.

[32] Mahsan Rofouei et al. Your phone or mine?: fusing body, touch and device sensing for multi-user device-display interaction, 2012, CHI.

[33] Abigail Sellen et al. Two-handed input in a compound task, 1994, CHI Conference Companion.

[34] Xing-Dong Yang et al. Magic finger: always-available input through finger instrumentation, 2012, UIST.

[35] M. Weiser. The Computer for the 21st Century, 1999, Computer.

[36] W. Buxton et al. A study in two-handed input, 1986, CHI '86.

[37] William Buxton et al. Contextual Animation of Gestural Commands, 1994, Comput. Graph. Forum.

[38] Andrés Lucero et al. Framing, aligning, paradoxing, abstracting, and directing: how design mood boards work, 2012, DIS '12.

[39] William Buxton et al. Large Displays in Automotive Design, 2000, IEEE Computer Graphics and Applications.

[40] Mark Guzdial et al. Software-Realized Scaffolding to Facilitate Programming for Science Learning, 1994, Interact. Learn. Environ.

[41] Saul Greenberg et al. What caused that touch?: expressive interaction with a surface through fiduciary-tagged gloves, 2010, ITS '10.

[42] Dimitre Novatchev et al. Chunking and Phrasing and the Design of Human-Computer Dialogues - Response, 1986, IFIP Congress.

[43] Dzmitry Aliakseyeu et al. Funky wall: presenting mood boards using gesture, speech and visuals, 2008, AVI '08.

[44] Tovi Grossman et al. Medusa: a proximity-aware multi-touch tabletop, 2011, UIST.

[45] Darren Leigh et al. DiamondTouch: a multi-user touch technology, 2001, UIST '01.

[46] Karen Holtzblatt et al. Contextual design, 1997, INTR.

[47] Xing-Dong Yang et al. See me, see you: a lightweight method for discriminating user touches on tabletop displays, 2012, CHI.

[48] Olivier Bau et al. OctoPocus: a dynamic guide for learning gesture-based command sets, 2008, UIST '08.

[49] Bernd Fröhlich et al. Finger and hand detection for multi-touch interfaces based on maximally stable extremal regions, 2012, ITS '12.

[50] Meredith Ringel Morris et al. ShadowGuides: visualizations for in-situ learning of multi-touch and whole-hand gestures, 2009, ITS '09.

[51] Shumin Zhai et al. Manual and cognitive benefits of two-handed input: an experimental study, 1998, TCHI.

[52] Deana McDonagh-Philp et al. Problem Interpretation and Resolution via Visual Stimuli: The Use of ‘Mood Boards’ in Design Education, 2001.

[53] Nicolai Marquardt et al. Proxemic interactions: the new ubicomp?, 2011, INTR.

[54] Daniel Vogel et al. Interactive public ambient displays: transitioning from implicit to explicit, public to personal, interaction with multiple users, 2004, UIST '04.

[55] Daniel J. Wigdor et al. Combining and measuring the benefits of bimanual pen and direct-touch interaction on horizontal interfaces, 2008, AVI '08.

[56] Koji Yatani et al. The 1line keyboard: a QWERTY layout in a single line, 2011, UIST '11.

[57] Patrick Olivier et al. Using IMUs to Identify Supervisors on Touch Devices, 2015, INTERACT.

[58] David Kotz et al. ZEBRA: Zero-Effort Bilateral Recurring Authentication, 2014, IEEE Symposium on Security and Privacy.

[59] Desney S. Tan et al. Tumble! Splat! helping users access and manipulate occluded content in 2D drawings, 2006, AVI '06.

[60] Wendy Ju et al. Range: exploring implicit interaction through electronic whiteboard design, 2008, CSCW.

[61] Johannes Schöning et al. Carpus: a non-intrusive user identification technique for interactive surfaces, 2012, UIST '12.