The long-range goal of our research is to create an intelligent assistant for interactive scientific data visualization via both sight and sound. A variety of computer-human interface (CHI) issues are unique to our approach to interactive visualization, and it is on these issues that we focus here. In this paper, we: (1) describe the approach to interactive visualization taken by the project that forms the context of our work; (2) specify the CHI issues peculiar to this approach; (3) summarize the current capabilities of our workstation for performing human factors experiments; (4) describe the research plan we have developed for learning how to provide a user with intelligent assistance in dealing with those issues; (5) present a representative pilot study that has contributed useful information; (6) summarize the results of our pilot studies; and (7) discuss the direction of our future work. We do not claim to be solving the general case of how to provide intelligent assistance for scientific visualization. We do, however, expect that the progress we make in one visualization environment will contribute to an understanding of the general case.