Intelligent Biometric Group Hand Tracking (IBGHT) database for visual hand tracking research and development

As innovation in vision-based hand gesture interaction systems increases, researchers continue to develop new techniques and algorithms. However, comparatively little attention has been paid to the hand tracking problem itself, and few publicly available databases exist to serve as benchmark data that standardize research in this area. To address this gap, we develop a versatile hand gesture tracking database. The database consists of 60 video sequences containing a total of 15,554 RGB color images. The tracking sequences are captured in situations ranging from an easy indoor scene to extremely challenging outdoor scenes. Complete with annotated ground truth data, the database is made available on the web so that researchers in related fields can test and evaluate their algorithms against a common benchmark.
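Because the abstract does not specify how the sequences and annotations are packaged, the following Python sketch only illustrates how such a benchmark might be consumed. It assumes a hypothetical directory layout (one folder per sequence with a frames/ subfolder and a groundtruth.csv of per-frame bounding boxes) and a user-supplied tracker object with a track() method; none of these names or file formats come from the IBGHT release itself.

```python
import csv
from pathlib import Path

# Hypothetical on-disk layout (the actual IBGHT distribution may differ):
#   IBGHT/<sequence_name>/frames/000001.jpg, 000002.jpg, ...
#   IBGHT/<sequence_name>/groundtruth.csv with rows: frame, x, y, width, height

def load_ground_truth(sequence_dir: Path):
    """Read per-frame hand bounding boxes from the sequence's annotation file."""
    boxes = {}
    with open(sequence_dir / "groundtruth.csv", newline="") as f:
        for row in csv.DictReader(f):
            boxes[int(row["frame"])] = (
                float(row["x"]), float(row["y"]),
                float(row["width"]), float(row["height"]),
            )
    return boxes

def center_error(pred, gt):
    """Euclidean distance between predicted and annotated box centers."""
    px, py = pred[0] + pred[2] / 2, pred[1] + pred[3] / 2
    gx, gy = gt[0] + gt[2] / 2, gt[1] + gt[3] / 2
    return ((px - gx) ** 2 + (py - gy) ** 2) ** 0.5

def evaluate(tracker, root: Path):
    """Run a tracker over every sequence and report the mean center error."""
    errors = []
    for sequence_dir in sorted(p for p in root.iterdir() if p.is_dir()):
        gt = load_ground_truth(sequence_dir)
        for frame_path in sorted((sequence_dir / "frames").glob("*.jpg")):
            frame_id = int(frame_path.stem)
            pred = tracker.track(frame_path)  # user-supplied tracker (hypothetical API)
            if frame_id in gt and pred is not None:
                errors.append(center_error(pred, gt[frame_id]))
    return sum(errors) / len(errors) if errors else float("nan")
```

Center-location error is used here only as one common tracking score; a region-overlap measure could be substituted in evaluate() without changing the loading code.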
