Enlarging a Smartphone with AR to Create a Handheld VESAD (Virtually Extended Screen-Aligned Display)

We investigate using augmented reality to extend the screen of a smartphone beyond its physical limits with a virtual surface that is co-planar with the phone and follows it as it is moved. We call this extension a VESAD, or Virtually Extended Screen-Aligned Display. We illustrate several ways that a VESAD could complement the physical screen of a phone, and describe two novel interaction techniques: one where the user performs a quick rotation of the phone to switch the information shown in the VESAD, and another, called "slide-and-hang", whereby the user detaches the VESAD and leaves it hanging in mid-air, using the phone to establish the initial position and orientation of the virtual window. We also report an experiment comparing three interfaces for an abstract classification task: the first using only a smartphone, the second using the phone for input but a VESAD for output, and the third where the user performed input in mid-air on the VESAD (as detected by a Leap Motion). The second interface was superior in completion time and in selection count (a measure of user errors), and was also subjectively preferred over the other two. This demonstrates the added value of a VESAD for output beyond the phone's physical screen, and shows that, in our experiment, input on the phone's screen outperformed input in mid-air.
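To make the two techniques concrete, the following is a minimal sketch (not the authors' implementation) of the core behaviors in Python with NumPy: a virtual window that stays co-planar with a tracked phone pose, a slide-and-hang operation that freezes the window's pose in world space, and a quick-rotation detector based on angular speed between consecutive poses. The Pose representation, the window dimensions, and the 270 deg/s threshold are illustrative assumptions, not values from the paper.

import numpy as np

class Pose:
    """World-space pose: position (3-vector) and rotation (3x3 matrix)."""
    def __init__(self, position, rotation):
        self.position = np.asarray(position, dtype=float)
        self.rotation = np.asarray(rotation, dtype=float)

class VesadWindow:
    """Virtual window co-planar with the phone until detached."""
    def __init__(self, width_m=0.40, height_m=0.25):  # assumed extension size
        self.size = (width_m, height_m)
        self.hanging_pose = None  # None => attached: the window follows the phone

    def pose(self, phone_pose):
        # While attached, the VESAD shares the phone's plane and follows it;
        # after slide-and-hang it keeps the pose captured at detach time.
        return phone_pose if self.hanging_pose is None else self.hanging_pose

    def slide_and_hang(self, phone_pose):
        # Detach: the phone's current pose fixes the window's position and
        # orientation, leaving it "hanging" in mid-air.
        self.hanging_pose = Pose(phone_pose.position.copy(),
                                 phone_pose.rotation.copy())

def is_quick_rotation(prev_pose, cur_pose, dt, threshold_deg_per_s=270.0):
    # Approximate the quick-rotation gesture as a high angular speed between
    # two consecutive tracked poses (threshold is an illustrative guess).
    relative = cur_pose.rotation @ prev_pose.rotation.T
    cos_angle = np.clip((np.trace(relative) - 1.0) / 2.0, -1.0, 1.0)
    return np.degrees(np.arccos(cos_angle)) / dt > threshold_deg_per_s

In an AR runtime, pose() would be called every frame with the tracked phone pose to place the window, slide_and_hang() would be invoked by the detach gesture, and is_quick_rotation() would trigger the content switch described above.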
