SemFeel: a user interface with semantic tactile feedback for mobile touch-screen devices

One of the challenges with using mobile touch-screen devices is that they do not provide tactile feedback to the user. Thus, the user is required to look at the screen to interact with these devices. In this paper, we present SemFeel, a tactile feedback system that informs the user about the presence of an object at the location she touches on the screen and can offer additional semantic information about that item. Through multiple vibration motors attached to the back of a mobile touch-screen device, SemFeel can generate different patterns of vibration, such as ones that flow from right to left or from top to bottom, to help the user interact with a mobile device. Through two user studies, we show that users can distinguish ten different patterns, including linear patterns and a circular pattern, with approximately 90% accuracy, and that SemFeel supports accurate eyes-free interaction.
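
As a rough illustration of how such flowing patterns might be produced, the sketch below pulses vibration motors one after another. This is a minimal sketch, not the authors' implementation: the drive_motor function, the motor layout, and the timing values are hypothetical placeholders.

    import time

    def drive_motor(index: int, on: bool) -> None:
        """Hypothetical stand-in for the hardware driver that switches one
        vibration motor on or off; here it only logs the command."""
        print(f"motor {index} -> {'on' if on else 'off'}")

    def play_flow(motor_sequence, pulse_s: float = 0.1, gap_s: float = 0.02) -> None:
        """Pulse the listed motors in order so the vibration is felt as
        moving across the device (e.g., from right to left)."""
        for index in motor_sequence:
            drive_motor(index, True)
            time.sleep(pulse_s)   # let this motor vibrate for one pulse
            drive_motor(index, False)
            time.sleep(gap_s)     # brief pause before the next motor

    if __name__ == "__main__":
        # Assuming motors 0..2 are laid out left to right on the back of the
        # device, this ordering yields a pattern that flows right to left.
        play_flow([2, 1, 0])

A circular pattern would follow the same scheme, with the motor indices ordered around the perimeter of the device instead of along one axis.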
