Multimodality in Language and Speech Systems
[1] Alistair D. N. Edwards, et al. Speech input for persons with speech impairments, 1995.
[2] Philip J. Barnard, et al. Interactions with Advanced Graphical Interfaces and the Deployment of Latent Human Knowledge, 1994, DSV-IS.
[3] Peter Gregor, et al. Intelligent system for speech and language impaired people: a portfolio of research, 1995.
[4] Alistair D. N. Edwards, et al. A Principled Design Methodology for Auditory Interaction, 1999, INTERACT.
[5] Richard Bowden. Progress in Sign and Gesture Recognition, 2004, AMDO.
[6] Marion B. Cothren. This is the moon, 1946.
[7] Jiajie Zhang, et al. A representational analysis of relational information displays, 1996, Int. J. Hum. Comput. Stud.
[8] Roger B. Dannenberg, et al. Multimedia interface design, 1992.
[9] Dimitrios Rigas, et al. Guidelines for auditory interface design: an empirical investigation, 1996.
[10] Alistair D. N. Edwards, et al. Mathematical Representations: Graphs, Curves and Formulas, 1993.
[11] Paul Blenkhorn, et al. A Method of Access to Computer Aided Software Engineering (CASE) Tools for Blind Software Engineers, 1994, ICCHP.
[12] K. Patterson, et al. Speak and spell: Dissociations and word-class effects, 1987.
[13] James L. Alty, et al. The Use of Music in a Graphical Interface for the Visually Impaired, 1997, INTERACT.
[14] Ian Oakley, et al. Putting the feel in ‘look and feel’, 2000, CHI.
[15] Elizabeth D. Mynatt, et al. Nonvisual presentation of graphical user interfaces: contrasting two approaches, 1994, CHI Conference Companion.
[16] Robert David Stevens, et al. Principles for the Design of Auditory Interfaces to Present Complex Information to Blind People, 1996.
[17] Alistair D. N. Edwards. Speech Synthesis: Technology for Disabled People, 1991.
[18] Andrea R. Kennel. Audiograf: a diagram-reader for the blind, 1996, Assets '96.
[19] Kenneth I. Joy, et al. Sound graphs: A numerical data analysis method for the blind, 1985, Journal of Medical Systems.
[20] Gary W. Strong, et al. An evaluation of the PRC Touch Talker with Minspeak: some lessons for speech prosthesis design, 1995.
[21] Martin Kurze, et al. TDraw: a computer-based tactile drawing tool for blind people, 1996, Assets '96.
[22] Jeff Shrager, et al. Modeling and analysis of dyslexic writing using speech and other modalities, 1995.
[23] Stephen Brewster, et al. Providing a structured method for integrating Non-Speech Audio into HCI, 1994.
[24] Stephen Brewster, et al. Experimentally Derived Guidelines for the Creation of Earcons, 2001.
[25] Grigori Evreinov, et al. Alternative textured display, 1998.
[26] J. Terry Mayes, et al. The ‘M-Word’: Multimedia Interfaces and Their Role in Interactive Learning Systems, 1992.
[27] Ben P. Challis. Design principles for tactile communication within the human-computer interface, 2001.
[28] Sidney L. Smith, et al. Design Guidelines for User-System Interface Software, 1984.
[29] Terri L. Bonebright, et al. Testing the effectiveness of sonified graphs for education: A programmatic research project, 2001.
[30] Alistair D. N. Edwards, et al. Exploration of non-seen diagrams, 1998.
[31] Craig A. Will, et al. Review of Virtual Environment Interface Technology, 1996.
[32] Evangelos Nikolaos Mitsopoulos. A principled approach to the design of auditory interaction in the non-visual user interface, 2000.
[33] Ben P. Challis, et al. Design Principles for Tactile Interaction, 2000, Haptic Human-Computer Interaction.
[34] Alistair D. N. Edwards. Progress in Sign Language Recognition, 1997, Gesture Workshop.
[35] Meera Blattner, et al. Earcons and Icons: Their Structure and Common Design Principles, 1989, Hum. Comput. Interact.
[36] Alistair D. N. Edwards. The rise of the graphical user interface, 1995.
[37] Frances Aldrich, et al. Tactile graphics in school education: perspectives from pupils, 2001.