Accurate and Low-Latency Sensing of Touch Contact on Any Surface with Finger-Worn IMU Sensor

Head-mounted Mixed Reality (MR) systems enable touch interaction on any physical surface. However, optical methods (i.e., cameras on the headset) have difficulty determining the moment of touch contact accurately. We show that a finger ring with an Inertial Measurement Unit (IMU) can substantially improve the accuracy of contact sensing from 84.74% to 98.61% (F1 score), with a low latency of 10 ms. We tested different ring-wearing positions and tapping postures (e.g., taps with different fingers and finger parts). Results show that an IMU-based ring worn on the proximal phalanx of the index finger can accurately sense touch contact for most usable tapping postures. Participants also preferred wearing a ring, reporting a better user experience. Our approach can be combined with optical touch sensing to provide robust, low-latency contact detection.
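
To make the sensing idea concrete, below is a minimal Python sketch of one common way to detect touch contact from a ring-worn IMU: flagging the sharp transient in accelerometer magnitude that a fingertip impact produces. The abstract does not specify the authors' detector, so the filter, sample rate, thresholds, and all names here are illustrative assumptions, not the paper's method.

import math
from collections import deque

SAMPLE_RATE_HZ = 400          # assumed IMU output data rate
GRAVITY_WINDOW = 40           # ~100 ms running mean to estimate the gravity baseline
SPIKE_THRESHOLD = 3.0         # residual acceleration (m/s^2) treated as an impact (assumed)
REFRACTORY_SAMPLES = 80       # ~200 ms lockout to avoid double-triggering on one tap


class TapDetector:
    """Detects touch contact as a transient spike in acceleration magnitude."""

    def __init__(self):
        self._history = deque(maxlen=GRAVITY_WINDOW)
        self._lockout = 0

    def update(self, ax: float, ay: float, az: float) -> bool:
        """Feed one accelerometer sample (m/s^2); return True on detected contact."""
        mag = math.sqrt(ax * ax + ay * ay + az * az)
        self._history.append(mag)

        if self._lockout > 0:
            self._lockout -= 1
            return False
        if len(self._history) < self._history.maxlen:
            return False  # still estimating the gravity baseline

        baseline = sum(self._history) / len(self._history)
        if abs(mag - baseline) > SPIKE_THRESHOLD:
            self._lockout = REFRACTORY_SAMPLES
            return True
        return False


if __name__ == "__main__":
    detector = TapDetector()
    # Synthetic stream: rest at 1 g, then a brief impact transient.
    samples = [(0.0, 0.0, 9.81)] * 60 + [(0.0, 0.0, 15.0)] + [(0.0, 0.0, 9.81)] * 10
    for i, (ax, ay, az) in enumerate(samples):
        if detector.update(ax, ay, az):
            print(f"contact detected at sample {i}")

Because the decision is made per sample from a short local window, latency on the order of a few samples (milliseconds at IMU rates) is plausible, which is consistent with the low-latency claim above; a production detector would likely add per-axis filtering and posture-specific thresholds.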
