Point cloud indexing in real-time motion capture

Today's human-computer interaction techniques are often gesture-based and thus strive for naturalness and immediacy. Implementing them requires non-invasive tracking systems that work with few or no body-attached devices, such as wireless optical motion capture. These technologies share a recurrent problem: maintaining coherent indexing of the captured points during real-time tracking. The inability to consistently distinguish tracked points limits both the naturalness of the interaction and the design possibilities. In this paper we present a real-time algorithm that addresses this point-indexing problem. Compared to other solutions, the presented approach adds a computed indexing correction that keeps the indexing coherent throughout the tracking session. The correction is applied automatically by the system whenever a specific configuration is detected. Our solution works with an arbitrary number of points and was primarily designed for fingertip tracking. A Virtual Reality application was developed to exploit the algorithm's functionality while testing its behavior and effectiveness. The application provides a stereoscopic, user-centric virtual environment in which the user can trigger simple interactions by reaching virtual objects with his/her fingertips.
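The abstract does not give implementation details of the indexing correction. As a rough illustration of the underlying point-correspondence problem only (not the paper's actual method), a minimal sketch of keeping point indices coherent between two consecutive frames via greedy nearest-neighbor assignment might look like the following; the function name and data layout are assumptions for illustration:

```python
import math

def match_indices(prev_pts, curr_pts):
    """Illustrative sketch: reorder the current frame's points so that
    each one inherits the index of its nearest point in the previous
    frame, keeping labels coherent across frames.

    prev_pts, curr_pts: equal-length lists of (x, y, z) tuples.
    Returns curr_pts reordered into previous-frame index order.
    """
    # Enumerate all candidate pairings with their Euclidean distances.
    pairs = []
    for i, p in enumerate(prev_pts):
        for j, q in enumerate(curr_pts):
            pairs.append((math.dist(p, q), i, j))
    pairs.sort()  # greedily take the closest unmatched pairs first

    used_prev, used_curr = set(), set()
    mapping = {}  # current-frame position j -> previous-frame index i
    for _, i, j in pairs:
        if i not in used_prev and j not in used_curr:
            mapping[j] = i
            used_prev.add(i)
            used_curr.add(j)

    # Emit current points in the index order of the previous frame.
    return [curr_pts[j] for j in sorted(mapping, key=mapping.get)]
```

A greedy match is fragile when points cross or occlude each other, which is precisely why a dedicated correction step, as described in the abstract, is needed; an optimal assignment (e.g. the Hungarian algorithm) is a common more robust alternative.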
