Making Blind People Autonomous in the Exploration of Tactile Models: A Feasibility Study

Blind people are typically excluded from equal access to the world's visual culture and are therefore often unable to reap the concrete benefits of art education and enjoyment. This is particularly true for paintings, whose two-dimensional nature cannot be explored through the sense of touch. The problem can be partially overcome by translating paintings into tactile bas-reliefs. However, evidence from recent studies suggests that tactile exploration alone is often not sufficient to fully understand and enjoy bas-reliefs; integrating different sensory stimuli dramatically enriches haptic exploration. Moreover, granting blind people the possibility of autonomously accessing and enjoying pictorial works of art is undoubtedly a good strategy to enrich their exploration. Accordingly, the main aim of the present work is to assess the feasibility of a new system consisting of a physical bas-relief, a vision system that tracks the blind user's hands during exploration, and an audio system that provides verbal descriptions. The study, supported by preliminary tests, demonstrates the effectiveness of such an approach, capable of transforming a frustrating, bewildering and negative experience (i.e., mere tactile exploration) into one that is liberating, fulfilling, stimulating and fun.
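The core interaction loop described above — tracking the user's hand over the bas-relief and triggering a verbal description for the region being touched — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the region layout, the dwell-frame threshold, and the `ExplorationNarrator` class are all hypothetical, and the actual system would draw fingertip coordinates from a markerless hand tracker rather than from synthetic samples.

```python
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class Region:
    """A rectangular area of the bas-relief with an associated verbal description."""
    name: str
    x0: float
    y0: float
    x1: float
    y1: float
    description: str

    def contains(self, x: float, y: float) -> bool:
        return self.x0 <= x <= self.x1 and self.y0 <= y <= self.y1


class ExplorationNarrator:
    """Maps tracked fingertip samples to regions and decides when to speak.

    A description is emitted once the fingertip has dwelt on the same region
    for `dwell_frames` consecutive samples, so brief passes over a region
    during exploration do not trigger spurious audio.
    """

    def __init__(self, regions: List[Region], dwell_frames: int = 10):
        self.regions = regions
        self.dwell_frames = dwell_frames
        self._current: Optional[Region] = None
        self._count = 0

    def update(self, x: float, y: float) -> Optional[str]:
        """Feed one fingertip sample; return a description to speak, or None."""
        region = next((r for r in self.regions if r.contains(x, y)), None)
        if region is not self._current:
            # The hand moved to a different region: restart the dwell counter.
            self._current = region
            self._count = 0
            return None
        if region is None:
            return None
        self._count += 1
        if self._count == self.dwell_frames:
            return region.description  # speak exactly once per dwell
        return None
```

In a real pipeline the returned string would be handed to a text-to-speech engine, and the fingertip samples would come from the camera-based hand tracker at frame rate; the dwell threshold trades responsiveness against false triggers while the user sweeps across the relief.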
