Adding haptics and sound to spatial curriculum

Many learners with disabilities (e.g., blindness, learning disabilities) rely heavily on touch and tactile manipulation to take in information, yet this mode of input and control is unavailable in distance-education delivery systems to date. The purpose of this research was to explore the expression of spatial concepts, such as geography, through several non-visual modalities, including haptics, 3D real-world sound, and speech, and to determine the optimal assignment of the available modalities to different types of information. The ultimate goal is to integrate these modalities into curricula delivered both at a distance and in the classroom, thereby benefiting students with and without disabilities.