Investigating Interactions for Text Recognition using a Vibrotactile Wearable Display

Vibrotactile skin-reading uses wearable vibrotactile displays to convey dynamically generated textual information. Such wearable displays have the potential to be used in a broad range of applications. Nevertheless, the reading process is passive, and users have no control over the reading flow. To compensate for this drawback, this paper investigates what kinds of interactions are necessary for vibrotactile skin reading and the modalities of such interactions. An interaction concept for skin reading was designed by considering reading as a process. We performed a formative study with 22 participants to assess reading behaviour in word and sentence reading using a six-channel wearable vibrotactile display. Our study shows that in sentence reading, word-based interactions are used more often and preferred by users over character-based interactions, and that users prefer gesture-based interaction for skin reading. Finally, we discuss how such wearable vibrotactile displays could be extended with sensors that would enable recognition of such gesture-based interactions. This paper contributes a set of guidelines for the design of wearable haptic displays for text communication.
