Shape Retrieval and 3D Gestural Interaction

Despite the emerging importance of Virtual Reality and immersive interaction research, no papers on the application of 3D shape retrieval to this topic have been presented at recent 3D Object Retrieval workshops. In this paper we discuss how geometric processing and geometric shape retrieval methods could be extremely useful for implementing effective natural interaction systems in immersive 3D virtual environments. In particular, we discuss how reducing complex gesture recognition tasks to simple geometric retrieval ones could help solve open issues in gestural interaction. For example, algorithms for robust point description in trajectory data, combined with the learning of inter-subject invariant features, could address relevant issues in direct manipulation, and 3D object retrieval methods could likewise be used to build gesture dictionaries and to implement guidance systems that maximize the usability of natural gestural interfaces.
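To make the core idea concrete, the following minimal sketch casts 3D gesture recognition as geometric retrieval: a query trajectory is resampled to a fixed number of points, normalized for position and scale, and matched against a small dictionary of template shapes by nearest-neighbor distance (in the spirit of $1/$3-style recognizers extended to 3D). All function names, the toy gesture dictionary, and the distance measure are illustrative assumptions, not the method of any cited system.

```python
import numpy as np

def resample(traj, n=32):
    """Resample a 3D trajectory to n points equally spaced along its arc length."""
    traj = np.asarray(traj, dtype=float)
    seg = np.linalg.norm(np.diff(traj, axis=0), axis=1)
    cum = np.concatenate([[0.0], np.cumsum(seg)])
    targets = np.linspace(0.0, cum[-1], n)
    # interpolate each coordinate independently over arc length
    return np.stack([np.interp(targets, cum, traj[:, k]) for k in range(3)], axis=1)

def normalize(traj):
    """Translate to the centroid and scale to unit radius (position/scale invariance)."""
    traj = traj - traj.mean(axis=0)
    scale = np.linalg.norm(traj, axis=1).max()
    return traj / scale if scale > 0 else traj

def retrieve(query, dictionary, n=32):
    """Return the label of the dictionary template closest to the query trajectory."""
    q = normalize(resample(query, n))
    best, best_d = None, float("inf")
    for label, template in dictionary.items():
        t = normalize(resample(template, n))
        d = np.linalg.norm(q - t, axis=1).mean()  # mean point-wise distance
        if d < best_d:
            best, best_d = label, d
    return best

# toy dictionary: a straight swipe and a circle in 3D
line = [(t, 0.0, 0.0) for t in np.linspace(0, 1, 20)]
circle = [(np.cos(a), np.sin(a), 0.0)
          for a in np.linspace(0, 2 * np.pi, 40, endpoint=False)]
gestures = {"swipe": line, "circle": circle}

# a shifted, rescaled circle performed with fewer samples still retrieves "circle"
query = [(2 * np.cos(a) + 5, 2 * np.sin(a) - 1, 0.1)
         for a in np.linspace(0, 2 * np.pi, 25, endpoint=False)]
print(retrieve(query, gestures))
```

Rotation invariance (e.g. via Procrustes alignment) and the inter-subject invariances discussed above would require learned descriptors rather than this fixed normalization; the sketch only shows how recognition reduces to a retrieval query over a shape dictionary.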
