Perceptual saliency in the visual channel enhances explicit language processing

Language processing is underpinned by working memory. While working memory for signed languages has been shown to display some of the characteristics of working memory for speech-based languages, a range of anomalous effects arise from the inherently visuospatial modality of signed languages. On the basis of these effects, four research questions were addressed in a series of studies:

1. Are differences in working memory storage for sign and speech reflected in neural representation?
2. Do the neural networks supporting speech-sign switching during a working memory task reflect executive or semantic processes?
3. Is working memory for sign language enhanced by a spatial style of information presentation?
4. Do the neural networks supporting word reversal indicate tongue-twisting or mind-twisting?

The results of the studies showed that:

1. Working memory for sign and speech is supported by a combination of modality-specific and nonmodality-specific neural networks.
2. Switching between sign and speech during a working memory task is supported by semantic rather than executive processes.
3. Working memory performance in educationally promoted native deaf signers is enhanced by a spatial style of presentation.
4. Word reversal is a matter of mind-twisting rather than tongue-twisting.

These findings indicate that working memory for sign and speech has both modality-specific and nonmodality-specific components. The modality-specific aspects can be explained in terms of Wilson's (2001) sensorimotor account, which is based on the component model (Baddeley, 2000), given that the functionality of the visuospatial sketchpad is extended to include language processing. Nonmodality-specific working memory processing is predicted by Rönnberg's (2003) model of cognitive involvement in language processing.
However, the modality-free, cross-modal and extra-modal aspects of working memory processing revealed in the present work can be explained in terms of the central executive and the episodic buffer, provided that the functionality and neural representation of the episodic buffer are extended. A functional ontology is presented which ties cognitive processes to their neural representation, along with a model explaining the modality-specific findings relating to sign language cognition. Predictions of the ontology and the model are discussed in relation to future work.