Multimodal feedback: an assessment of performance and mental workload
H. S. Vitense | J. A. Jacko | V. K. Emery
[1] David S. Ebert, et al. The integrality of speech in multimodal interfaces, 1998, TCHI.
[2] M. Akamatsu. The influence of combined visual and tactile information on finger and eye movements during shape tracing, 1992, Ergonomics.
[3] Marilyn Rose McGee. A haptically enhanced scrollbar: force-feedback as a means of reducing the problems associated with scrolling, 1999.
[4] Philip R. Cohen, et al. Synergistic use of direct manipulation and natural language, 1989, CHI '89.
[5] Susan G. Hill, et al. Workload Assessment of a Remotely Piloted Vehicle (RPV) System, 1988.
[6] J. G. Hollands, et al. Engineering Psychology and Human Performance, 1984.
[7] Motoyuki Akamatsu, et al. Movement characteristics using a mouse with tactile and force feedback, 1996, Int. J. Hum. Comput. Stud.
[8] Stephen A. Brewster, et al. Enhancing scanning input with non-speech sounds, 1996, Assets '96.
[9] R. Nelson, et al. Reaction times for hand movements made in response to visual versus vibratory cues, 1990, Somatosensory & Motor Research.
[10] Alistair D. N. Edwards, et al. An improved auditory interface for the exploration of lists, 1997, MULTIMEDIA '97.
[11] Antonella De Angeli, et al. Integration and synchronization of input modes during multimodal human-computer interaction, 1997, CHI.
[12] Alexander G. Hauptmann, et al. Speech and gestures for graphic image manipulation, 1989, CHI '89.
[13] Sharon L. Oviatt, et al. Multimodal system processing in mobile environments, 2000, UIST '00.
[14] Peter Robinson, et al. Cognitive considerations in the design of multi-modal input systems, 2001, HCI.
[15] Raymond S. Nickerson, et al. Human-Computer Interaction: Background and Issues, 1997.
[16] Aude Dufresne, et al. Touching and hearing GUI's: design issues for the PC-Access system, 1996, Assets '96.
[17] Minh Tue Vo, et al. Building an application framework for speech and pen input integration in multimodal learning interfaces, 1996, ICASSP '96.
[18] John M. Carroll, et al. Human-Computer Interaction in the New Millennium, 2001.
[19] Stephen A. Brewster. Sonically-enhanced drag and drop, 1998.
[20] Christopher D. Wickens, et al. Workload Assessment and Prediction, 1990.
[21] Joëlle Coutaz, et al. A generic platform for addressing the multimodal challenge, 1995, CHI '95.
[22] Louis Rosenberg, et al. Using force feedback to enhance human performance in graphical user interfaces, 1996, CHI '96.
[23] Daniel Gopher, et al. Workload: An examination of the concept, 1986.
[24] Ian Oakley, et al. Solving multi-target haptic problems in menu interaction, 2001, CHI Extended Abstracts.
[25] Mark S. Sanders, et al. Human Factors in Engineering and Design, 7th ed., 1993.
[26] J. C. Byers, et al. Comparison of Four Subjective Workload Rating Scales, 1992.
[27] S. Hart, et al. Development of NASA-TLX (Task Load Index): Results of Empirical and Theoretical Research, 1988.
[28] P. S. Tsang, et al. Techniques of Subjective Workload Assessment: A Comparison of SWAT and the NASA-Bipolar Methods, 1986.
[29] Stephen A. Brewster, et al. Using nonspeech sounds to provide navigation cues, 1998, TCHI.
[30] Motoyuki Akamatsu, et al. A multi-modal mouse with tactile and force feedback, 1994, Int. J. Hum. Comput. Stud.
[31] Meera Blattner, et al. Earcons and Icons: Their Structure and Common Design Principles, 1989, Hum. Comput. Interact.
[32] Stephen A. Brewster. Using nonspeech sounds to provide navigation cues, 1998.
[33] F. L. Engel, et al. Improved efficiency through I- and E-feedback: a trackball with contextual force feedback, 1994, Int. J. Hum. Comput. Stud.
[34] Stephen A. Brewster, et al. Correcting menu usability problems with sound, 1999, Behav. Inf. Technol.
[35] Jin Liu, et al. Three-dimensional PC: toward novel forms of human-computer interaction, 2001, Optics East.
[36] Sharon L. Oviatt, et al. Designing the User Interface for Multimodal Speech and Pen-Based Gesture Applications: State-of-the-Art Systems and Future Research Directions, 2000, Hum. Comput. Interact.
[37] Wen Gao, et al. A Parallel Multistream Model for Integration of Sign Language Recognition and Lip Motion, 2000, ICMI.
[38] Jonathan Grudin, et al. Design and evaluation, 1995.
[39] Colin Ware, et al. Eye-hand co-ordination with force feedback, 2000, CHI.
[40] Christopher A. Miller, et al. User acceptance of an intelligent user interface: a Rotorcraft Pilot's Associate example, 1998, IUI '99.
[41] Liang Chen, et al. QuickSet: Multimodal Interaction for Simulation Set-up and Control, 1997, ANLP.
[42] Richard A. Bolt, et al. “Put-that-there”: Voice and gesture at the graphics interface, 1980, SIGGRAPH '80.
[43] Guozhong Dai, et al. An Experimental Study of Input Modes for Multimodal Human-Computer Interaction, 2000, ICMI.
[44] Patrick W. Demasco, et al. Multimodal input for computer access and augmentative communication, 1996, Assets '96.
[45] Shumin Zhai, et al. What You Feel Must Be What You See: Adding Tactile Feedback to the Trackpoint, 1999, INTERACT.
[46] Gavriel Salvendy, et al. Prediction of Mental Workload in Single and Multiple Tasks Environments, 2000.
[47] Anthony J. Aretz, et al. An Empirical Validation of Subjective Workload Ratings, 1996.