Gazing into the future

[1] John Paulin Hansen, et al. Command Without a Click: Dwell Time Typing by Mouse and Gaze Selections, 2003, INTERACT.

[2] Aulikki Hyrskykari, et al. Eyes in Attentive Interfaces: Experiences from Creating iDict, a Gaze-Aware Reading Aid, 2006.

[3] Fabio Paternò, et al. Automatic Support for Usability Evaluation, 1998, IEEE Trans. Software Eng.

[4] I. Scott MacKenzie, et al. KSPC (Keystrokes per Character) as a Characteristic of Text Entry Techniques, 2002, Mobile HCI.

[5] R. Johansson, et al. Eye–Hand Coordination in Object Manipulation, 2001, The Journal of Neuroscience.

[6] Päivi Majaranta, et al. From Gaze Control to Attentive Interfaces, 2005.

[7] Zhiwei Guan, et al. The validity of the stimulated retrospective think-aloud method as measured by eye tracking, 2006, CHI.

[8] Dan Witzner Hansen, et al. Eye tracking in the wild, 2005, Comput. Vis. Image Underst.

[9] R. Barnes. Motion and time study, 1950.

[10] N. Iwahashi, et al. A method for the coupling of belief systems through human-robot language interaction, 2003, Proceedings of the 12th IEEE International Workshop on Robot and Human Interactive Communication (ROMAN 2003).

[11] J. P. Hansen. The use of eye mark recordings to support verbal retrospection in software testing, 1991.

[12] Geoffrey M. Underwood. Eye fixations on pictures of natural scenes: Getting the gist and identifying the components, 2005.

[13] Laura Chamberlain. Eye Tracking Methodology: Theory and Practice, 2007.

[14] John Paulin Hansen, et al. Gaze typing compared with input by head and hand, 2004, ETRA.

[15] Samuel Kaski, et al. Combining eye movements and collaborative filtering for proactive information retrieval, 2005, SIGIR '05.

[16] D. Ballard, et al. Eye movements in natural behavior, 2005, Trends in Cognitive Sciences.

[17] Rafael Cabeza, et al. Eye tracking: Pupil orientation geometrical modeling, 2006, Image Vis. Comput.

[18] David J. Ward, et al. Fast Hands-free Writing by Gaze Direction, 2002, ArXiv.

[19] Craig A. Grimes, et al. Encyclopedia of Sensors, 2006.

[20] J. B. Brooke, et al. SUS: A 'Quick and Dirty' Usability Scale, 1996.

[21] Roel Vertegaal, et al. ViewPointer: lightweight calibration-free eye tracking for ubiquitous handsfree deixis, 2005, UIST.

[22] Janni Nielsen, et al. Getting access to what goes on in people's heads?: reflections on the think-aloud technique, 2002, NordiCHI '02.

[23] Ted Boren, et al. Thinking aloud: reconciling theory and practice, 2000.

[24] John Paulin Hansen, et al. A comparative usability study of two Japanese gaze typing systems, 2006, ETRA.

[25] Susan M. Dray, et al. Remote possibilities?: international usability testing at a distance, 2004, INTR.

[26] I. Scott MacKenzie, et al. Measuring errors in text entry tasks: an application of the Levenshtein string distance statistic, 2001, CHI Extended Abstracts.

[27] Vladimir N. Vapnik, et al. The Nature of Statistical Learning Theory, 2000, Statistics for Engineering and Information Science.

[28] Howell O. Istance. Communication through eye-gaze: where we have been, where we are now and where we can go from here, 2006, ETRA '06.

[29] Jakob Nielsen, et al. Usability engineering, 1997, The Computer Science and Engineering Handbook.

[30] Peter Jackson, et al. Natural Language Processing for Online Applications: Text Retrieval, Extraction & Categorization, 2002.

[31] E. Reed. The Ecological Approach to Visual Perception, 1989.

[32] Marti A. Hearst, et al. The state of the art in automating usability evaluation of user interfaces, 2001, CSUR.

[33] Jeffrey S. Shell, et al. Eye contact sensing glasses for attention-sensitive wearable video blogging, 2004, CHI EA '04.

[34] Jun Gong, et al. A new error metric for text entry method evaluation, 2006, CHI.

[35] Terry C. Lansdown, et al. The mind's eye: cognitive and applied aspects of eye movement research, 2005.

[36] M. W. van Someren, et al. The think aloud method: a practical approach to modelling cognitive processes, 1994.

[37] Päivi Majaranta, et al. Effects of feedback on eye typing with a short dwell time, 2004, ETRA.

[38] K. A. Ericsson, et al. Protocol Analysis: Verbal Reports as Data, 1984.

[39] Fangmin Shi, et al. Vision responsive technology to assist people with limited mobility, 2006.

[40] Roel Vertegaal, et al. Designing attentive interfaces, 2002, ETRA.

[41] Zahid Hussain. Digital image processing - practical applications of parallel processing techniques, 1991, Ellis Horwood series in digital and signal processing.