Age and Gestural Differences in the Ease of Rotating a Virtual 3D Image on a Large, Multi-Touch Screen

Providing a natural mapping between multi-touch gestures and manipulations of digital content is important for user-friendly interfaces. Although guidelines for 2D digital content are available in the literature, guidelines for manipulating 3D content have yet to be developed. In this research, two sets of gestures were developed to examine the ease of manipulating 3D content on a touchscreen. Because age groups typically differ greatly in the ease of learning new interfaces, a group of adults was compared with a group of children. Each participant carried out three tasks that involved rotating a digital model of a green turtle to inspect the major characteristics of its body. Task completion time, subjective evaluations, and gesture-changing frequency were measured. Results showed that the conventional gestures for 2D object rotation were not appropriate in the 3D environment. Gestures requiring multiple touch points hampered the real-time visibility of rotational effects on a large screen. Because the cumulative effects of 3D rotations became complicated after intensive operations, simpler gestures facilitated the mapping between 2D control movements and 3D content displays. For rotation in Cartesian coordinates, moving one fingertip horizontally or vertically on the 2D touchscreen corresponded to the rotation angles about two axes of the 3D content, while the relative movement between two fingertips controlled the rotation angle about the third axis. Behavior analysis showed that adults and children differed in the diversity of gesture types and in the placement of touch points with respect to the object's contours. Offering a robust mechanism for gestural inputs is necessary for universal control of such a system.
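The axis mapping described above (one-finger horizontal/vertical drags driving two rotation axes, and the relative movement between two fingertips driving the third) can be sketched as follows. This is a minimal illustrative implementation, not the authors' actual code; the function name, the gain parameter, and the assignment of screen axes to pitch/yaw/roll are assumptions for the sake of the example.

```python
import math

def rotation_from_touch(points_prev, points_curr, gain=0.5):
    """Map 2D touch movement to 3D rotation deltas in degrees.

    One finger:  horizontal drag -> yaw (y axis),
                 vertical drag   -> pitch (x axis).
    Two fingers: change in the orientation of the line joining the
                 fingertips -> roll (z axis).
    Returns (pitch, yaw, roll); `gain` scales pixels to degrees.
    """
    if len(points_prev) == 1 and len(points_curr) == 1:
        dx = points_curr[0][0] - points_prev[0][0]
        dy = points_curr[0][1] - points_prev[0][1]
        return (gain * dy, gain * dx, 0.0)
    if len(points_prev) == 2 and len(points_curr) == 2:
        def angle(p, q):
            # orientation of the segment from p to q, in degrees
            return math.degrees(math.atan2(q[1] - p[1], q[0] - p[0]))
        droll = angle(*points_curr) - angle(*points_prev)
        return (0.0, 0.0, droll)
    return (0.0, 0.0, 0.0)  # unrecognized touch configuration
```

Keeping single-finger drags responsible for two axes and reserving the two-finger twist for the third is one way to realize the "simpler gestures" finding: each control movement maps to exactly one rotation axis, so the cumulative effect stays predictable.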
