International Journal of Advanced Robotic Systems
PARLOMA – A Novel Human-Robot Interaction System for Deaf-Blind Remote Communication

Deaf-blindness forces people into isolation. At present, no technological solution enables two (or more) deaf-blind people to communicate remotely with one another in tactile Sign Language (t-SL). When resorting to t-SL, deaf-blind people can communicate only with people physically present in the same place, because they must reciprocally explore each other's hands to exchange messages. We present a preliminary version of PARLOMA, a novel system that enables remote communication between deaf-blind persons. It combines a low-cost depth sensor, as the only input device, with an anthropomorphic robotic hand as the output device. A user performs hand-shapes in front of the depth sensor; the system recognizes a set of hand-shapes, transmits them over the web, and reproduces them with the robotic hand. PARLOMA can thus work as a "telephone" for deaf-blind people and promises to dramatically improve their quality of life. PARLOMA has been presented to, and is supported by, the main Italian deaf-blind association, the Lega del Filo d'Oro, and end users have been involved in the design phase.
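The pipeline described above (depth-based hand-shape recognition, transmission over the web, reproduction on a robotic hand) can be sketched as a minimal message flow. This is a hypothetical illustration, not the actual PARLOMA software: the names `HandShapeMessage`, `JOINT_TARGETS`, `encode`, and `decode` are assumptions introduced here, and a real system would stream recognized shapes over a network socket and map them to actuator commands.

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class HandShapeMessage:
    """Illustrative wire format: a recognized hand-shape plus joint targets."""
    label: str                 # recognized hand-shape (e.g., a fingerspelled letter)
    joints: dict               # target joint angles in radians, keyed by finger

# Hypothetical lookup from a recognized shape to robotic-hand joint targets.
JOINT_TARGETS = {
    "A": {"thumb": 0.2, "index": 1.4, "middle": 1.4},
    "B": {"thumb": 1.2, "index": 0.0, "middle": 0.0},
}

def encode(label: str) -> str:
    """Sender side: package a recognized hand-shape as JSON for transmission."""
    msg = HandShapeMessage(label=label, joints=JOINT_TARGETS[label])
    return json.dumps(asdict(msg))

def decode(payload: str) -> HandShapeMessage:
    """Receiver side: unpack the message before driving the robotic hand."""
    return HandShapeMessage(**json.loads(payload))

if __name__ == "__main__":
    wire = encode("A")          # produced by the recognizer at the sender
    received = decode(wire)     # consumed by the hand controller at the receiver
    print(received.label, received.joints)
```

In a deployment, `encode` would run on the machine attached to the depth sensor and `decode` on the machine controlling the robotic hand, with any transport (e.g., a WebSocket) carrying the JSON payload between them.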
