Development of a Talking Tactile Tablet

Researchers have long understood the value of tactile presentation of pictures, maps and diagrams for readers who are blind or otherwise visually impaired (Edman, 1992). However, some practicalities have always limited the usefulness and appeal of these materials. It is often difficult for a blind individual to make sense of tactile shapes and textures without some extra information to confirm or augment what has been touched (Kennedy, Tobias & Nichols, 1991). Labeling a drawing with Braille is one way to accomplish this, but since Braille tags must be large and have plenty of blank space around them to be legible, they are not ideal for use with fairly complex or graphically rich images. Also, reliance on Braille labeling restricts the usefulness of tactile graphics to those blind or visually impaired persons who are competent Braille readers, a lamentably small population.

In order to enrich the tactile graphic experience and to allow for a broader range of users, products like NOMAD have been created. NOMAD was developed by Dr. Don Parkes of the University of New South Wales and was first brought to market in 1989 (Parkes, 1994). When connected to a computer, NOMAD promised to enhance the tactile experience by allowing a user to view pictures, graphs, diagrams, etc., and then to press on various features to hear descriptions, labels and other explanatory audio material. Further, NOMAD aspired to offer the kind of multimedia, interactive experiences that have exploded onto the scene in visual computing. Several factors, however, have always prevented the NOMAD system from being widely adopted. The resolution of its touch-sensitive surface is low, so precise correspondence between graphic images and audio tags is difficult to achieve. Speech is synthetically produced, so it is not as clear or "user friendly" as the pre-recorded human speech common in mainstream CD-ROMs. Perhaps most importantly, high-quality interactive programming and tactile media were not independently produced in sufficient quantities to justify the hardware investment, as the NOMAD creators had envisioned.

Given the immense promise of audio-tactile strategies to open interactive learning and entertainment to those whose vision problems preclude their use of a mouse or a video monitor, Touch Graphics was established in 1997. This for-profit company, created in cooperation with Baruch College's Computer Center for Visually Impaired People, has successfully competed for research and development funds through the Small Business Innovation Research programs at the US Department of Education and the National Science Foundation. The work has led, thus far, to the creation of a prototype Talking Tactile Tablet (TTT) and three interactive programs for use with the device.

The hardware component of the TTT system is an extremely simple, durable and inexpensive "easel" (fig. 1) for holding tactile graphic sheets motionless against a high-resolution touch-sensitive surface. A user's finger pressure is transmitted through any of a variety of flexible tactile graphic overlays to this surface, which is a standard hardened-glass touch screen of the kind typically used in conjunction with a video monitor in ATMs and other applications. The TTT device is connected to a host computer by a single cable plugged into the USB port of a Macintosh or Windows-based computer. Alternatively, a completely free-standing version (fig. 2) has been created, in which the tablet is set into a "docking station" that incorporates a single-board computer, a hard disk drive, an audio system, and connections to peripheral devices. In both cases, the computer interprets the user's presses on the tactile graphic overlay sheet in exactly the same way that it interprets a mouse click while the cursor is over a particular region, icon or other object on a video screen. With appropriate software, the system promises to open the world of "point and click" computing to blind and visually impaired users. …
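
To illustrate the point-and-click analogy, the following is a minimal sketch, in Python, of how touch coordinates reported by the tablet might be mapped to labeled regions of an overlay sheet and used to trigger recorded descriptions. The region layout, file names and audio call are hypothetical illustrations, not the actual TTT software.

```python
from dataclasses import dataclass


@dataclass
class Region:
    """A rectangular hotspot on one tactile overlay sheet (hypothetical layout data)."""
    name: str
    x: int           # left edge, in touch-screen coordinates
    y: int           # top edge
    width: int
    height: int
    audio_file: str  # pre-recorded description to play when this region is pressed

    def contains(self, px: int, py: int) -> bool:
        return (self.x <= px < self.x + self.width
                and self.y <= py < self.y + self.height)


# Example overlay: a simple tactile map with two labeled features.
OVERLAY_REGIONS = [
    Region("lake", 120, 80, 200, 150, "audio/lake_description.wav"),
    Region("mountain", 400, 60, 180, 220, "audio/mountain_description.wav"),
]


def play_audio(path: str) -> None:
    # Placeholder: a real system would hand the file to the host's audio player.
    print(f"[playing] {path}")


def handle_press(px: int, py: int) -> None:
    """Treat a press on the overlay like a mouse click: find the region under
    the finger and play its recorded description."""
    for region in OVERLAY_REGIONS:
        if region.contains(px, py):
            play_audio(region.audio_file)
            return
    play_audio("audio/no_feature_here.wav")


if __name__ == "__main__":
    handle_press(150, 100)   # press inside the "lake" region
    handle_press(10, 10)     # press on empty background
```

In this sketch, each overlay sheet carries its own table of regions, so swapping sheets amounts to loading a different region list while the touch-to-coordinate step stays unchanged.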