Press-n-Paste: Copy-and-Paste Operations with Pressure-sensitive Caret Navigation for Miniaturized Surface in Mobile Augmented Reality

Copy-and-paste operations are among the most frequently used features on computing devices such as desktop computers, smartphones, and tablets. However, copy-and-paste is not sufficiently supported on Augmented Reality (AR) smartglasses designed for real-time interaction with text in physical environments. This paper proposes two system solutions, namely Granularity Scrolling (GS) and Two Ends (TE), for copy-and-paste operations on AR smartglasses. By leveraging a thumb-sized button on a touch-sensitive and pressure-sensitive surface, both multi-step solutions capture the target text through indirect manipulation and subsequently enable copy-and-paste operations. Based on these system solutions, we implemented an experimental prototype named Press-n-Paste (PnP). In an eight-session evaluation capturing 1,296 copy-and-paste operations, 18 participants using GS and TE achieved peak performance of 17,574 ms and 13,951 ms per copy-and-paste operation, with accuracy rates of 93.21% and 98.15% respectively, which is comparable to commercial solutions using direct manipulation on touchscreen devices. The user footprints also show that PnP has a distinctively miniaturized interaction area of 12.65 mm × 14.48 mm. PnP not only demonstrates the feasibility of copy-and-paste operations with flexible granularities on AR smartglasses, but also offers significant implications for the design space of pressure widgets and for input design on smart wearables.
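The Granularity Scrolling idea of using pressure on a small surface to switch between selection granularities can be sketched as a simple mapping from a normalized pressure reading to a caret-movement unit. This is a minimal illustrative sketch only; the band boundaries, granularity names, and function are assumptions, not the paper's calibrated implementation.

```python
# Hypothetical sketch: a normalized pressure reading (0.0-1.0) selects
# the caret-movement granularity for text selection. The equal-width
# pressure bands below are illustrative assumptions, not measured values.

GRANULARITIES = ["character", "word", "sentence", "paragraph"]

def granularity_for_pressure(pressure: float) -> str:
    """Map a normalized pressure value to a text-selection granularity."""
    pressure = max(0.0, min(1.0, pressure))  # clamp to the sensor range
    # Divide the pressure range into equal bands, one per granularity.
    band = min(int(pressure * len(GRANULARITIES)), len(GRANULARITIES) - 1)
    return GRANULARITIES[band]
```

With four equal bands, a light press moves the caret by characters while a firm press jumps by paragraphs, letting a thumb-sized widget cover coarse and fine caret navigation without extra screen area.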
