"It's all about the start" classifying eyes-free mobile authentication techniques

Abstract

Mobile device users who want to avoid observational attacks or cope with situational impairments may employ eyes-free mobile unlock authentication, in which the user enters a passcode without looking at the device. This study provides an initial description of user accuracy in performing this authentication task with PIN and pattern passcodes of varying lengths and visual characteristics. We also investigate whether tactile-only feedback can provide assistive spatialization, finding that orientation cues presented prior to unlocking do not help. Accuracy was measured using edit distance and dynamic time warping in a within-group, randomized study of 26 participants. A total of 1,021 passcode entry gestures were collected and classified, yielding six user strategies for employing the pre-entry tactile feedback and ten codes for the types of events and errors that occurred during entry. We found that users who focused on orienting themselves to the position of the first digit of the passcode using the tactile feedback performed better at the task. These results can be used to better define eyes-free behavior in future research and to design more usable and more secure methods for eyes-free authentication.
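The abstract names two accuracy metrics, edit distance and dynamic time warping, without spelling out how they would be applied to passcode entries. The sketch below is a minimal, hypothetical illustration (not the paper's implementation): it assumes PIN entries are compared as digit strings with Levenshtein edit distance, and pattern or gesture entries are compared as sequences of (x, y) touch points with a standard DTW cost. All function names, coordinate conventions, and the example passcodes are assumptions introduced for illustration only.

```python
# Hypothetical sketch (not the paper's implementation): scoring an eyes-free
# passcode entry against its target using the two metrics named in the abstract.
from math import hypot


def edit_distance(entered, target):
    """Levenshtein distance between two symbol sequences, e.g. PIN digit strings."""
    m, n = len(entered), len(target)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        dp[i][0] = i
    for j in range(n + 1):
        dp[0][j] = j
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if entered[i - 1] == target[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,          # deletion
                           dp[i][j - 1] + 1,          # insertion
                           dp[i - 1][j - 1] + cost)   # substitution
    return dp[m][n]


def dtw_distance(trace_a, trace_b):
    """Dynamic time warping cost between two 2-D touch traces,
    each given as a list of (x, y) points."""
    m, n = len(trace_a), len(trace_b)
    INF = float("inf")
    dtw = [[INF] * (n + 1) for _ in range(m + 1)]
    dtw[0][0] = 0.0
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            ax, ay = trace_a[i - 1]
            bx, by = trace_b[j - 1]
            d = hypot(ax - bx, ay - by)  # Euclidean distance between matched points
            dtw[i][j] = d + min(dtw[i - 1][j],      # repeat a point of trace_a
                                dtw[i][j - 1],      # repeat a point of trace_b
                                dtw[i - 1][j - 1])  # advance both traces
    return dtw[m][n]


if __name__ == "__main__":
    # Example: a 4-digit PIN entered with the last two digits swapped.
    print(edit_distance("1375", "1357"))  # -> 2 (two substitutions)

    # Example: a short pattern trace drifting slightly from the intended path.
    target_trace = [(0, 0), (1, 0), (2, 0), (2, 1)]
    entered_trace = [(0, 0), (1.1, 0.2), (2, 0.1), (2.1, 1.0)]
    print(round(dtw_distance(entered_trace, target_trace), 3))
```

A lower score on either metric indicates an entry closer to the intended passcode; edit distance suits discrete symbol sequences, while DTW tolerates differences in speed and sampling between two continuous gesture traces.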
