MagicScroll: a rollable display device with flexible screen real estate and gestural input

We present MagicScroll, a rollable tablet with two concatenated flexible multitouch displays, actuated scrollwheels, and gestural input. When rolled up, MagicScroll can be used as a Rolodex, smartphone, expressive messaging interface, or gestural controller. When extended, it provides full access to its 7.5" high-resolution multitouch display, offering the display functionality of a tablet device. We believe that the cylindrical shape of the rolled-up configuration facilitates gestural interaction, while its shape-changing and input capabilities support the navigation of continuous information streams and provide focus plus context functionality. We investigated the gestural affordances of MagicScroll in its rolled-up configuration by means of an elicitation study.
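To make the focus plus context idea concrete, the following minimal sketch (not taken from the paper; all class names, parameters, and pixel values are illustrative assumptions) shows how scrollwheel rotation could pan a continuous information stream while the degree of unrolling widens the visible viewport around the same scroll position.

```python
# Hypothetical sketch: scrollwheel rotation pans a long document, while the
# roll extension level sets how much of it is visible (focus plus context).
# All names and numbers below are assumptions for illustration only.

class RollableViewport:
    def __init__(self, document_length_px: int,
                 min_view_px: int = 320,    # visible width when fully rolled up
                 max_view_px: int = 1920):  # visible width when fully extended
        self.document_length_px = document_length_px
        self.min_view_px = min_view_px
        self.max_view_px = max_view_px
        self.scroll_px = 0

    def on_scrollwheel(self, delta_ticks: int, px_per_tick: int = 40) -> None:
        """Rotating the wheel pans through the continuous information stream."""
        self.scroll_px = max(0, min(self.document_length_px,
                                    self.scroll_px + delta_ticks * px_per_tick))

    def view(self, extension: float) -> tuple[int, int]:
        """Extension in [0, 1] (rolled up .. fully unrolled) sets the focus size."""
        width = int(self.min_view_px +
                    extension * (self.max_view_px - self.min_view_px))
        start = max(0, min(self.document_length_px - width, self.scroll_px))
        return start, start + width


# Usage: rolled up, the device shows a narrow focus window; unrolling it
# widens the window around the same scroll position to reveal context.
vp = RollableViewport(document_length_px=20000)
vp.on_scrollwheel(delta_ticks=50)
print(vp.view(extension=0.0))   # narrow window while rolled up
print(vp.view(extension=1.0))   # wide window when fully extended
```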
