The Effects of Speech Intelligibility in a Simulated Industrial Process Control Task

Payne, Peters, Birkmire, Bonto, Anastasi, and Wenger (in press, Human Factors) used a dual-task paradigm and varied speech intelligibility levels in an auditory task while subjects performed concurrent visual tasks. They found significant cross-modal interference effects, but only in tasks that involved the central processes of working memory and decision making. The present research assessed cross-modal interference in a more complex decision-making environment. We used the NASA Multi-Attribute Task Battery, which consists of three visual tasks (choice reaction time, compensatory tracking, and complex decision making) and one auditory communications task. Subjects performed in both single- and dual-task conditions, with the dual-task conditions always including an auditory task in which we varied the intelligibility level (as measured by the Modified Rhyme Test) of the speech signal. Replicating Payne et al. (in press), there was no effect of speech intelligibility on performance in the compensatory tracking task, which does not load heavily on working memory or decision making, but there were effects of speech intelligibility in the reaction time and decision-making tasks. These results demonstrate that the same pattern of selective interference effects reported by Payne et al. (in press) can be obtained with different visual tasks and under conditions that more closely mimic the task demands of many real-world situations. Finally, this study represents an important conceptual link between our earlier laboratory research (Payne et al., in press) and the simulation research of Whitaker, Peters, and Garinther (1989).