With the development of novel interfaces controlled through multiple modalities, new approaches are needed to analyze the process of interaction with such interfaces and to evaluate them at a fine grain of detail. To evaluate the usability and usefulness of such interfaces, one needs tools for collecting and analyzing richly detailed data on both the process and the outcomes of user interaction. Eye tracking is a technology that can provide detailed data on the allocation and shifts of users' visual attention across interface entities. Eye movement data, when combined with data from other input modalities (such as spoken commands and haptic actions with the keyboard and mouse), yields just such a rich data set. However, integrating, analyzing, and visualizing multimodal data on user interactions remains a difficult task. In this paper, we report on a first step toward developing a suite of tools to facilitate this task. We designed and implemented an Eye Tracking Analysis System that generates combined gaze-and-action visualizations from eye movement data and interaction logs. This visualization allows an experimenter to see users' visual attention shifts interleaved with their actions on each screen of a multi-screen interface. To test the utility of our tool, we carried out a pilot experiment comparing two interfaces to an educational multimedia application: a traditional interface and a speech-controlled one.
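The abstract does not give implementation details, but the core of such a tool is interleaving two time-stamped event streams (gaze fixations and logged user actions) into a single per-screen timeline. The sketch below is a minimal illustration of that merging step; all names, fields, and sample data are hypothetical and are not taken from the paper.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Event:
    timestamp: float  # seconds from session start
    screen: str       # which screen of the multi-screen interface
    kind: str         # "fixation" or "action"
    detail: str       # fixated interface entity, or the user action taken

def merge_streams(fixations: List[Event], actions: List[Event]) -> List[Event]:
    """Interleave gaze fixations and logged actions into one chronological
    sequence, the ordering a combined gaze-and-action visualization
    would be drawn from."""
    return sorted(fixations + actions, key=lambda e: e.timestamp)

# Hypothetical sample data: two fixations and one spoken command.
fixations = [
    Event(1.20, "menu", "fixation", "Play button"),
    Event(2.05, "menu", "fixation", "Volume slider"),
]
actions = [
    Event(1.80, "menu", "action", 'speech: "play chapter two"'),
]

for e in merge_streams(fixations, actions):
    print(f"{e.timestamp:6.2f}s  [{e.screen}]  {e.kind}: {e.detail}")
```

A real analysis system would additionally need to align the clocks of the eye tracker and the interaction logger and to segment the merged timeline by screen, but the timestamp-ordered interleaving shown here is the step that lets attention shifts be read alongside user actions.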