Cognitive Models of the Effect of Audio Cueing on Attentional Shifts in a Complex Multimodal, Dual-Display Dual Task

Derek Brock (brock@itd.nrl.navy.mil) and Brian McClimens (mcclimen@itd.nrl.navy.mil)
Naval Research Laboratory, 4555 Overlook Ave., S.W., Washington, DC 20375 USA

Anthony Hornof (hornof@cs.uoregon.edu) and Tim Halverson (thalvers@cs.uoregon.edu)
Department of Computer and Information Science, 1202 University of Oregon, Eugene, OR 97403-1202 USA

Abstract

A comparative cognitive model of two manipulations of a complex dual task in which 3D audio cueing was used to improve operator performance is presented. The model is implemented within the EPIC cognitive architecture and describes extensions that were necessary to simulate gaze shifts and the allocation of attention between separated task displays. A simulation of meta-cognitive decision-making to explain unprompted, volitional shifts of attention and the effect of audio cueing on performance and the frequency of attention shifts are explored.

Keywords: cognitive modeling; EPIC; dual task; 3D auditory cueing; separated task displays; gaze shifts; volitional shifts of attention; simulated meta-cognitive decision-making; gamma distribution; sense of timing

Introduction and Background

System designers take numerous approaches to reducing the number of workstation operators needed to accomplish complex decision-making tasks in Navy command-and-control centers, including (a) the automation of multiple tasks and (b) the adoption of supervisory rather than direct control. As workstation operators are asked to manage an increasing number of tasks, reliable techniques are needed for managing operator attention. The research presented here demonstrates how cognitive modeling can explain the way operators manage conflicting attention demands and how it can inform the design of human-machine interfaces that support efficient and accurate multi-display, multi-task execution.

The Navy has developed a prototype decision support workstation that features three flat-panel monitors centered in a 135° arc in front of the user (Osga, 2000). With this configuration, the operator can access a great deal of data but loses peripheral access to all three monitors when his or her gaze is turned to either the right or left screen. This loss can reduce the speed and accuracy of critical decisions (Brock et al., 2002; Brock et al., 2004).

The Naval Research Laboratory (NRL) is developing techniques for directing attention in complex operational settings using three-dimensional (3D), or "spatialized," sound (Begault, 1994). Properly designed 3D sounds can convey a variety of task-related information, including the onset, location, and identity of critical events. Brock et al. (2004) demonstrated that the use of 3D sound can significantly improve dual-task performance.

The research presented here describes recent cognitive modeling work undertaken to explain the effects of audio cueing observed by Brock et al. (2004), including the effect of audio cueing on the allocation of attention between tasks. The models are based on the human data observed by Brock et al. in (a) the "no sound" condition and (b) one particular sound condition (the "screen-centric" condition). Cognitive modeling is a research practice that endeavors to build computer programs that behave in some way like human beings.

The models presented here are implemented within the EPIC (Executive Process-Interactive Control) cognitive architecture (Kieras and Meyer, 1997), a computational framework for building models of human performance based on the constraints of human perceptual, cognitive, and motor processing. The cognitive modeling presented in this paper specifically explores (a) a simulation of meta-cognitive decision-making to explain volitional shifts of attention, (b) performance aspects of task-related audio cueing, a relatively new domain for cognitive modeling, and (c) extensions to the perceptual-motor components of EPIC that were necessary to simulate a complex dual-display task.

The Attention Management Study

The Dual Task

Figure 1 shows the physical layout of the dual task modeled in this paper. The task is from Brock et al. (2004). Participants used a joystick to continuously track an evasive target on the right and, at the same time, used the keyboard to periodically assess and classify "blips" moving down the radar screen on the left. The right task is "tracking" and the left task is "tactical." The task displays were separated by 90° of arc, such that the unattended display could not be seen with peripheral vision. The task was originally developed by Ballas, Heitmeyer & Perez (1992) and is analogous in many ways to the level of multitask activity to which future Navy workstation operators will be subjected.

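To make the timing idea concrete before turning to the model details, the following Python sketch simulates gaze shifts between the two displays. It is only an illustration under stated assumptions, not the EPIC production-rule model described in this paper: dwell times on the tracking display before an unprompted, volitional shift are drawn from a gamma distribution, standing in for an imprecise sense of timing, and in the cued condition a 3D audio announcement of a new tactical event can pre-empt that internal timer. All numeric parameters (gamma shape and scale, event rate, gaze-shift cost, tactical dwell time) are hypothetical placeholders, not values estimated from Brock et al. (2004).

    """Toy discrete-event simulation of gaze shifts between two separated
    task displays (illustration only; not the authors' EPIC model)."""
    import random

    GAZE_SHIFT_TIME = 0.25        # s: assumed cost of reorienting gaze 90 degrees
    SESSION_LENGTH = 300.0        # s: simulated session duration
    TACTICAL_EVENT_RATE = 1 / 15  # assumed mean of one new "blip" event every 15 s

    def simulate(audio_cues: bool, seed: int = 1):
        """Return (number of gaze shifts to the tactical display,
        mean latency from a tactical event onset to the next shift)."""
        rng = random.Random(seed)

        # Poisson arrivals of tactical events over the session.
        events, t_event = [], 0.0
        while True:
            t_event += rng.expovariate(TACTICAL_EVENT_RATE)
            if t_event > SESSION_LENGTH:
                break
            events.append(t_event)
        pending = list(events)

        t, shifts, latencies = 0.0, 0, []
        while t < SESSION_LENGTH:
            # Volitional timing: stay on the tracking display until an internal,
            # gamma-distributed timer says "check the radar."
            next_shift = t + rng.gammavariate(4.0, 2.5)
            # Cued condition: an audio cue for a waiting event interrupts the timer.
            if audio_cues and pending:
                next_shift = min(next_shift, max(t, pending[0]))
            t = next_shift
            shifts += 1
            # Every event that has already occurred is answered by this shift.
            while pending and pending[0] <= t:
                latencies.append(t - pending.pop(0))
            # Time spent reorienting plus handling blips on the tactical display.
            t += GAZE_SHIFT_TIME + rng.uniform(1.0, 3.0)

        mean_latency = sum(latencies) / len(latencies) if latencies else float("nan")
        return shifts, mean_latency

    if __name__ == "__main__":
        for cued in (False, True):
            n, lat = simulate(audio_cues=cued)
            label = "audio cues" if cued else "no sound"
            print(f"{label:>10}: {n:3d} shifts, mean event-to-shift latency {lat:5.2f} s")

Run with both settings, the sketch typically produces more frequent shifts and shorter event-to-shift latencies in the cued condition than in the no-sound condition, which is the qualitative pattern of interest here; the full model addresses the same question with EPIC's perceptual, cognitive, and motor constraints rather than these placeholder distributions.
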
References

[1] Begault, D. R. (1994). 3-D Sound for Virtual Reality and Multimedia. Cambridge, MA: Academic Press Professional.
[2] Ballas, J. A., et al. (2004). The design of mixed use virtual auditory displays: Recent findings with a dual-task paradigm. Proceedings of ICAD 2004.
[3] Mimura, O. (1992). [Eye movements]. Nippon Ganka Gakkai Zasshi.
[4] Brock, D., et al. (2002). Effects of 3D auditory display on dual task performance in a simulated multiscreen watchstation environment.
[5] Whitteridge, D. (1979). Movements of the eyes: R. H. S. Carpenter, Pion Ltd, London (1977), 420 pp. Neuroscience.
[6] Kieras, D. E., et al. (2001). Computational models for the effects of localized sound cuing in a complex dual task.
[7] Munoz, D. P., et al. (1996). The influence of auditory and visual distractors on human orienting gaze shifts. The Journal of Neuroscience.
[8] Brock, D., et al. (2002). Using an auditory display to manage attention in a dual task, multiscreen environment.
[9] Osga, G. A. (2000). 21st century workstations: Active partners in accomplishing task goals.
[10] Pérez-Quiñones, M. A., et al. (1992). Evaluating two aspects of direct manipulation in advanced cockpits. Proceedings of CHI '92.
[11] Kieras, D. E., & Meyer, D. E. (1997). An overview of the EPIC architecture for cognition and performance with application to human-computer interaction. Human-Computer Interaction.