Using Bayes' Theorem for Command Input: Principle, Models, and Applications

Entering commands on touchscreens is inherently noisy, yet existing interfaces commonly adopt deterministic principles for deciding the intended target and thus often result in errors. Building on prior research on using Bayes' theorem to handle input uncertainty, this paper formalizes Bayes' theorem as a generic guiding principle for deciding targets in command input (referred to as "BayesianCommand"), develops three models for estimating prior and likelihood probabilities, and reports experiments demonstrating the effectiveness of this formalization. Specifically, we applied BayesianCommand to improve the input accuracy of (1) point-and-click and (2) word-gesture command input. Our evaluation showed that applying BayesianCommand reduced errors compared with using deterministic principles (by over 26.9% for point-and-click and 39.9% for word-gesture command input) or applying the principle only partially (by over 28.0% and 24.5%, respectively).
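To make the decision rule concrete, the sketch below shows how a Bayesian target decision might look for the point-and-click case: the selected command maximizes the posterior P(c | s) ∝ P(c) · P(s | c), where the evidence P(s) is constant across commands and drops out of the argmax. This is a minimal illustration, not the paper's implementation: the isotropic Gaussian touch-point likelihood, the frequency-based prior, and all parameter values (e.g., `sigma`) are assumptions introduced here for clarity; the paper's three prior/likelihood models are more elaborate.

```python
import math

# Minimal sketch of Bayesian command selection (BayesianCommand-style
# decision rule). The Gaussian likelihood and frequency prior below are
# illustrative assumptions, not the paper's exact models.

def gaussian_likelihood(touch, center, sigma=20.0):
    """P(touch | command): isotropic 2D Gaussian around the target center."""
    dx, dy = touch[0] - center[0], touch[1] - center[1]
    return math.exp(-(dx * dx + dy * dy) / (2.0 * sigma * sigma)) / (
        2.0 * math.pi * sigma * sigma
    )

def select_command(touch, targets, prior):
    """Return the command maximizing the posterior P(c | touch).

    Bayes' theorem: P(c | s) is proportional to P(c) * P(s | c); the
    evidence P(s) is identical for all commands, so it can be ignored.
    Log space avoids underflow when probabilities are small.
    """
    return max(
        targets,
        key=lambda c: math.log(prior[c])
        + math.log(gaussian_likelihood(touch, targets[c])),
    )

# Example: two on-screen commands with usage-frequency priors. A touch
# landing between the targets is resolved by combining distance
# (likelihood) with how often each command is used (prior).
targets = {"copy": (100.0, 40.0), "paste": (140.0, 40.0)}
prior = {"copy": 0.7, "paste": 0.3}
print(select_command((118.0, 42.0), targets, prior))  # -> "copy"
```

A deterministic interface would decide purely by hit-box containment or nearest target; the Bayesian rule instead lets a frequently used command win ambiguous touches near its boundary, which is the source of the error reductions reported above.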
