Real Time Translator for Sign Languages

Sign language is a medium of communication for people with hearing and speech impairments. They communicate through distinct hand gestures, where each gesture conveys a specific meaning. This article focuses on removing the communication barrier between hearing people and people who are deaf or have speech impairments. It aims to translate sign language in real time from a mobile camera feed, so that the system can act as a medium of conversation between signing and non-signing users.
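The article does not spell out its pipeline, but a minimal sketch of the kind of real-time camera-based recognition it describes might look like the following. It assumes OpenCV for frame capture and MediaPipe Hands for landmark extraction (neither is stated in the original), and `predict_sign` is a hypothetical placeholder for whatever gesture classifier the system actually trains.

```python
# Sketch only: capture frames from a phone/webcam, extract hand landmarks,
# and map the landmark vector to a sign label with a placeholder classifier.
import cv2
import mediapipe as mp
import numpy as np

mp_hands = mp.solutions.hands


def predict_sign(landmarks: np.ndarray) -> str:
    """Hypothetical classifier: a trained model (e.g. k-NN or a small neural
    network over the 21 (x, y, z) hand landmarks) would go here."""
    return "?"


def run(camera_index: int = 0) -> None:
    cap = cv2.VideoCapture(camera_index)
    with mp_hands.Hands(max_num_hands=1, min_detection_confidence=0.5) as hands:
        while cap.isOpened():
            ok, frame = cap.read()
            if not ok:
                break
            # MediaPipe expects RGB; OpenCV delivers BGR frames.
            results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            if results.multi_hand_landmarks:
                lm = results.multi_hand_landmarks[0].landmark
                vec = np.array([[p.x, p.y, p.z] for p in lm]).flatten()
                label = predict_sign(vec)
                cv2.putText(frame, label, (30, 60),
                            cv2.FONT_HERSHEY_SIMPLEX, 2, (0, 255, 0), 3)
            cv2.imshow("sign-translator", frame)
            if cv2.waitKey(1) & 0xFF == ord("q"):
                break
    cap.release()
    cv2.destroyAllWindows()


if __name__ == "__main__":
    run()
```

The design choice here is only illustrative: landmark-based classification keeps the per-frame cost low enough for real-time use on a mobile device, which matches the article's stated goal.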
