Augmented reality target finding based on tactile cues

This study is based on a user scenario in which augmented reality targets can be found by scanning the environment with a mobile device and receiving tactile feedback exactly in the direction of the target. To understand how accurately and quickly such targets can be found, we prepared an experimental setup in which a sensor-actuator device consisting of orientation tracking hardware and a tactile actuator was used. Targets with widths of 5°, 10°, 15°, 20°, and 25° and with varying distances between them were rendered successively in a 90°-wide space, and the task of the test participants was to find them as quickly as possible. The experiment consisted of two conditions: the first provided tactile feedback only when pointing was on the target, and the second also included an additional cue indicating proximity to the target. The average target finding time was 1.8 seconds. The closest targets were not the easiest to find, which we attribute to participants adapting their scanning velocity and therefore sweeping past the closest targets. We also found that our data did not correlate well with Fitts' model, which may have been caused by the non-normal distribution of the data. After filtering out the 30% least representative data items, the correlation reached 0.71. Overall, performance did not differ significantly between the conditions. The only significant improvement offered by the close-to-target cue occurred in the tasks where the targets were furthest from each other.
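
As a rough, purely illustrative sketch of the Fitts' law check mentioned above (the numbers below are invented, not the study's data), movement time MT can be regressed against the Shannon index of difficulty ID = log2(D/W + 1), where D is the angular distance to the target and W its angular width; the Pearson correlation between ID and MT then plays the role of the reported 0.71:

import numpy as np

# Hypothetical condition averages (assumed values, for illustration only):
# target width W and distance D in degrees, mean finding time MT in seconds.
W  = np.array([5, 10, 15, 20, 25,  5, 10, 15, 20, 25], dtype=float)
D  = np.array([20, 20, 20, 20, 20, 60, 60, 60, 60, 60], dtype=float)
MT = np.array([2.4, 2.0, 1.8, 1.6, 1.5, 2.9, 2.4, 2.1, 1.9, 1.7])

ID = np.log2(D / W + 1.0)        # Shannon formulation of the index of difficulty
b, a = np.polyfit(ID, MT, 1)     # least-squares fit of MT = a + b * ID
r = np.corrcoef(ID, MT)[0, 1]    # Pearson correlation between ID and MT

print(f"MT = {a:.2f} + {b:.2f} * ID,  r = {r:.2f}")

With real trial data, the same regression would be computed per condition, and discarding the least representative trials before fitting corresponds to the 30% filtering step described above.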

[1] Michael Rohs et al. Target acquisition with camera phones when used as magic lenses. CHI, 2008.

[2] M. Akamatsu et al. A comparison of tactile, auditory, and visual feedback in a pointing task using a mouse-type device. Ergonomics, 1995.

[3] Stephen Brewster et al. Gesture Interaction with Spatial Audio Displays: Effects of Target Size and Inter-Target Separation. 2005.

[4] Vuokko Lantz et al. Perception of dynamic audiotactile feedback to gesture input. ICMI '08, 2008.

[5] Matt Jones et al. Evaluating haptics for information discovery while walking. BCS HCI, 2009.

[6] Xiang Cao et al. Peephole pointing: modeling acquisition of dynamically revealed targets. CHI, 2008.

[7] Steven Strachan et al. GeoPoke: rotational mechanical systems metaphor for embodied geosocial interaction. NordiCHI, 2008.

[8] Marcelo Knörich Zuffo et al. On the usability of gesture interfaces in virtual reality environments. CLIHC '05, 2005.

[9] P. Fitts. The information capacity of the human motor system in controlling the amplitude of movement. Journal of Experimental Psychology, 1954.

[10] Stephen A. Brewster et al. Wrist rotation for interaction in mobile contexts. Mobile HCI, 2008.

[11] Tim Schwartz et al. Three output planning strategies for use in context-aware computing scenarios. 2008.

[12] Roope Raisamo et al. Evaluating Tactile Feedback in Graphical User Interfaces. 2002.

[13] Antonio Krüger et al. Robust speech interaction in a mobile environment through the use of multiple and different media input types. INTERSPEECH, 2003.

[14] Richard D. Gilson et al. Vibrotactile Guidance Cues for Target Acquisition. IEEE Transactions on Systems, Man, and Cybernetics, Part C (Applications and Reviews), 2007.

[15] Stephen A. Brewster et al. Effects of feedback, mobility and index of difficulty on deictic spatial audio target acquisition in the horizontal plane. CHI, 2006.

[16] Steven Strachan et al. Show me the way to Monte Carlo: density-based trajectory navigation. CHI, 2007.

[17] Steven Strachan et al. It's a long way to Monte Carlo: probabilistic display in GPS navigation. Mobile HCI, 2006.

[18] Ka-Ping Yee. Peephole displays: pen interaction on spatially aware handheld computers. CHI '03, 2003.

[19] Tue Haste Andersen. A simple movement time model for scrolling. CHI EA '05, 2005.

[20] Vuokko Lantz et al. Dynamic audiotactile feedback in gesture interaction. Mobile HCI, 2008.

[21] John M. Flach et al. Control Theory for Humans: Quantitative Approaches to Modeling Performance. 2002.