The feasibility of eyes-free touchscreen keyboard typing
Typing on a touchscreen keyboard is very difficult without being able to see the keyboard. We propose a new approach in which users imagine a Qwerty keyboard somewhere on the device and tap out an entire sentence without any visual reference to the keyboard and without intermediate feedback about the letters or words typed. To demonstrate the feasibility of our approach, we developed an algorithm that decodes blind touchscreen typing with a character error rate of 18.5%. Our decoder combines three components: a model of the keyboard topology and tap variability, a point transformation algorithm, and a long-span statistical language model. Our initial results show that the proposed method provides fast entry rates and promising error rates; on one-third of the sentences, novices' highly noisy input was decoded with no errors.
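The abstract names the decoder's three components but not the algorithm itself. The sketch below is a minimal illustration under assumptions, not the paper's method: it scores candidate words by a Gaussian tap-variability model over assumed Qwerty key centers plus a language-model prior, and it omits the point-transformation step that aligns taps to the user's imagined keyboard. All layout constants, function names, and the toy vocabulary are hypothetical.

```python
import math

# Hypothetical key centers on a unit-width imagined Qwerty layout
# (key size and row offsets are assumptions, not the paper's values).
KEY_WIDTH = 0.1
ROWS = ["qwertyuiop", "asdfghjkl", "zxcvbnm"]
ROW_OFFSETS = [0.0, 0.05, 0.15]

KEY_CENTERS = {}
for row, (letters, offset) in enumerate(zip(ROWS, ROW_OFFSETS)):
    for col, ch in enumerate(letters):
        KEY_CENTERS[ch] = (offset + (col + 0.5) * KEY_WIDTH,
                           (row + 0.5) * KEY_WIDTH)


def tap_log_likelihood(points, word, sigma=0.05):
    """Log-likelihood of observed tap points under an isotropic 2D Gaussian
    centered on each intended key (an assumed tap-variability model)."""
    if len(points) != len(word):
        return float("-inf")
    total = 0.0
    for (x, y), ch in zip(points, word):
        cx, cy = KEY_CENTERS[ch]
        d2 = (x - cx) ** 2 + (y - cy) ** 2
        total += -d2 / (2 * sigma ** 2) - math.log(2 * math.pi * sigma ** 2)
    return total


def decode(points, vocabulary, lm_log_prob, lm_weight=1.0):
    """Return the word maximizing keyboard likelihood plus a weighted
    language-model prior (a stand-in for the paper's long-span model)."""
    return max(vocabulary,
               key=lambda w: tap_log_likelihood(points, w)
                             + lm_weight * lm_log_prob(w))


# Toy usage: a unigram 'language model' over a two-word vocabulary.
vocab = {"hello": 0.6, "hollow": 0.4}
lm = lambda w: math.log(vocab[w])
taps = [KEY_CENTERS[c] for c in "hello"]  # perfectly placed taps for the demo
print(decode(taps, vocab, lm))  # -> 'hello'
```

In the paper's setting the decoder would also have to infer where on the device the user imagined the keyboard; a point-transformation step (for example, translating and scaling the tap sequence before scoring) would handle that alignment, which this sketch leaves out for brevity.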