Type-hover-swipe in 96 bytes: a motion sensing mechanical keyboard

We present a new type of augmented mechanical keyboard capable of sensing rich and expressive motion gestures performed both on and directly above the device. Our hardware comprises a low-resolution matrix of infrared (IR) proximity sensors interspersed between the keys of a regular mechanical keyboard, which yields coarse but high-frame-rate motion data. We extend a machine learning algorithm, traditionally used for static classification only, to robustly support dynamic, temporal gestures. We propose the use of motion signatures, a technique that uses pairs of motion history images and a random forest classifier to robustly recognize a large set of motion gestures on and directly above the keyboard. Our technique achieves a mean per-frame classification accuracy of 75.6% in leave-one-subject-out cross-validation and 89.9% in half-test/half-training cross-validation. We detail our hardware and gesture recognition algorithm, report performance and accuracy figures, and demonstrate a large set of gestures designed to be performed with our device. We conclude with qualitative feedback from users and a discussion of limitations and areas for future work.
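The recognition pipeline described above can be sketched compactly. The Python snippet below is a minimal illustration under stated assumptions, not the authors' implementation: the 4x16 sensor grid, the decay constants of 4 and 16 frames, the five gesture classes, and the use of scikit-learn's RandomForestClassifier are all placeholders. It models the "pair of motion history images" as one short-decay and one long-decay history, concatenated into a single per-frame feature vector.

import numpy as np
from sklearn.ensemble import RandomForestClassifier

def update_mhi(mhi, motion_mask, tau=16):
    """One update step of a motion history image (MHI): pixels moving in
    the current frame are set to tau; all other pixels decay by one frame,
    clamped at zero (Davis and Bobick's temporal templates)."""
    return np.where(motion_mask, float(tau), np.maximum(mhi - 1.0, 0.0))

# Hypothetical 4x16 grid of IR proximity sensors; the masks below are
# random stand-ins for thresholded per-frame motion.
rng = np.random.default_rng(0)
grid = (4, 16)
short_mhi = np.zeros(grid)  # fast-decaying history (recent motion)
long_mhi = np.zeros(grid)   # slow-decaying history (longer context)

for mask in rng.random((30, *grid)) > 0.8:
    short_mhi = update_mhi(short_mhi, mask, tau=4)
    long_mhi = update_mhi(long_mhi, mask, tau=16)

# Per-frame "motion signature": the pair of histories as one feature vector.
signature = np.concatenate([short_mhi.ravel(), long_mhi.ravel()])

# Per-frame classification with a random forest, trained here on random
# stand-in data purely to show the shape of the pipeline.
X = rng.random((200, signature.size))
y = rng.integers(0, 5, size=200)  # five hypothetical gesture classes
clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
print(clf.predict(signature[None, :]))

In the paper's setup, per-frame predictions of this kind, made on every incoming sensor frame, are what the mean per-frame accuracies quoted above are computed over.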
