A New Method for Interacting with Multi-Window Applications on Large, High Resolution Displays

Physically large display walls can now be constructed from off-the-shelf computer hardware. The high resolution of these displays (e.g., 50 million pixels) means that a large quantity of data can be presented to users, making the displays well suited to visualization applications. However, current methods of interacting with display walls are time-consuming. We analyzed how users solve real visualization problems using three desktop applications (XmdvTool, IRIS Explorer, and ArcView), and used a new taxonomy to classify users' actions and illustrate the deficiencies of current display wall interaction methods. Following this, we designed a novel method for interacting with display walls, which aims to let users interact as quickly as they do when a visualization application is used on a desktop system. Informal feedback gathered from our working prototype shows that interaction is both fast and fluid.
