Design foundations for content-rich acoustic interfaces: investigating audemes as referential non-speech audio cues
[1] Robert C. Morrison,et al. A Microcomputer-Based Laboratory Aid for Visually Impaired Students , 1983, IEEE Micro.
[2] A. Baddeley. Essentials of Human Memory , 1999 .
[3] Terri L. Bonebright,et al. Sonification Report: Status of the Field and Research Agenda , 2010 .
[4] Paul A Lucas,et al. An evaluation of the communicative ability of auditory icons and earcons , 1994 .
[5] Mexhid Ferati,et al. Usability evaluation of acoustic interfaces for the blind , 2011, SIGDOC '11.
[6] R. Tytler,et al. Opening up pathways : engagement in STEM across the primary-secondary school transition , 2008 .
[7] Antti Pirhonen,et al. Non-Speech Sounds as Elements of a Use Scenario: A Semiotic Perspective , 2006 .
[8] Jaime Sánchez,et al. Mobile Messenger for the Blind , 2006, Universal Access in Ambient Intelligence Environments.
[9] Petr Janata,et al. Marketbuzz: Sonification of Real-Time Financial Data , 2004, ICAD.
[10] Catherine Guastavino,et al. Usability of Non-Speech Sounds in User Interfaces , 2008 .
[11] Jacob O. Wobbrock,et al. Slide rule: making mobile touch screens accessible to blind people using multi-touch interaction techniques , 2008, Assets '08.
[12] Petr Janata,et al. Marketbuzz: Sonification of real-time financial data , 2004 .
[13] Mikael Fernström,et al. After direct manipulation---direct sonification , 1998, TAP.
[14] Jaime Sánchez,et al. AudioMath: Blind children learning mathematics through audio , 2005 .
[15] Patrick Baudisch,et al. Blindsight: eyes-free access to mobile phones , 2008, CHI.
[16] E. Klima. The signs of language , 1979 .
[17] Eli Hagen,et al. Towards an American Sign Language interface , 1994, Artificial Intelligence Review.
[18] A. Glenberg,et al. Comprehension of illustrated text: Pictures help to build mental models , 1992 .
[19] C. Y. Nolan,et al. The Comprehension of Rapid Speech by the Blind , 1962 .
[20] R. Mayer,et al. Nine Ways to Reduce Cognitive Load in Multimedia Learning , 2003 .
[21] Jaime Sánchez,et al. Interactive virtual acoustic environments for blind children: computing, usability, and cognition , 2001, CHI Extended Abstracts.
[22] Maria Klara Wolters,et al. Name that tune: musicons as reminders in the home , 2011, CHI.
[23] Mohammad Suyanto. Final Cut Pro , 2010 .
[24] Michel Chion,et al. Audio-Vision: Sound on Screen , 1994 .
[25] Paolo Paolini,et al. Designing aural information architectures , 2006, SIGDOC '06.
[26] Albert S. Bregman,et al. Auditory Scene Analysis: The Perceptual Organization of Sound , 1990 .
[27] Tohru Ifukube,et al. Maximum listening speeds for the blind , 2003 .
[28] Jean-François Rouet,et al. Effects of Online Reading on Popular Science Comprehension , 2003 .
[29] Stephen A. Brewster,et al. Parallel earcons: reducing the length of audio messages , 1995, Int. J. Hum. Comput. Stud..
[30] Alistair D. N. Edwards,et al. Soundtrack: An Auditory Interface for Blind Users (Abstract Only) , 1989, SGCH.
[31] Bruce N. Walker,et al. Mappings and metaphors in auditory displays: An experimental assessment , 2005, TAP.
[32] Stéphane Conversy. Ad-hoc synthesis of auditory icons , 1998 .
[33] Marco Fabiani,et al. Interactive sonification of emotionally expressive gestures by means of music performance , 2010 .
[34] Hideki Koike,et al. EdgeSonic: image feature sonification for the visually impaired , 2011, AH '11.
[35] Sara Ann Bly. Sound and computer information presentation , 1982 .
[36] Duncan P. Brumby,et al. Fast or safe?: how performance objectives determine modality output choices while interacting on the move , 2011, CHI.
[37] John G. Neuhoff,et al. Sonification Report: Status of the Field and Research Agenda Prepared for the National Science Foundation by members of the International Community for Auditory Display , 1999 .
[38] Kenneth R. Lord,et al. Exploring the dimensionality of the need for cognition scale , 2006 .
[39] Stephen A. Brewster,et al. The design of sonically-enhanced widgets , 1998, Interact. Comput..
[40] Mexhid Ferati,et al. Audemes at work: Investigating features of non-speech sounds to maximize content recognition , 2012, Int. J. Hum. Comput. Stud..
[41] Mexhid Ferati,et al. Back navigation shortcuts for screen reader users , 2012, ASSETS '12.
[42] Stephen A. Brewster,et al. Investigating touchscreen accessibility for people with visual impairments , 2008, NordiCHI.
[43] William W. Gaver. Auditory Icons: Using Sound in Computer Interfaces , 1986, Hum. Comput. Interact..
[44] Stephen Brewster,et al. Providing a Structured Method for Integrating Non-Speech Audio into Human-Computer Interfaces , 1994 .
[45] Stephen Brewster,et al. Using non-speech sound to overcome information overload , 1997 .
[46] David Sonnenschein,et al. Sound Design: The Expressive Power of Music, Voice and Sound Effects in Cinema , 2001 .
[47] Ephraim P. Glinert,et al. Multimodal Integration , 1996, IEEE Multim..
[48] Sarah Guri-Rozenblit,et al. The interrelations between diagrammatic representations and verbal explanations in learning from social science texts , 1988 .
[49] T. Brock,et al. The Role of Transportation in the Persuasiveness of Public Narratives , 2000, Journal of Personality and Social Psychology.
[50] Helen Petrie,et al. Auditory navigation in hyperspace: design and evaluation of a non-visual hypermedia system for blind users , 1998, Assets '98.
[51] Mexhid Ferati,et al. Educational Sound Symbols for the Visually Impaired , 2009, HCI.
[52] Stephen Brewster,et al. Providing an audio glance at algebra for blind readers , 1994 .
[53] Mark S. Sanders,et al. Human Factors in Engineering and Design , 2016 .
[54] Stephen A. Brewster,et al. Earcons as a Method of Providing Navigational Cues in a Menu Hierarchy , 1996, BCS HCI.
[55] G. Bower. Mood and memory. , 1981, The American psychologist.
[56] R. Mayer,et al. A Split-Attention Effect in Multimedia Learning: Evidence for Dual Processing Systems in Working Memory , 1998 .
[57] Jaime H. Sanchez,et al. Independent Outdoor Mobility for the Blind , 2007, 2007 Virtual Rehabilitation.
[58] Robert M. Bernard. Using extended captions to improve learning from instructional illustrations , 1990, Br. J. Educ. Technol..
[59] Bruce N. Walker,et al. Learning Rates for Auditory Menus Enhanced with Spearcons versus Earcons , 2007 .
[60] Stephen Brewster,et al. A Survey of Audio-Related Knowledge amongst Software Engineers Developing Human-Computer Interfaces , 2001 .
[61] Marilyn McGee-Lennon,et al. Audio reminders in the home environment , 2007 .
[62] Stephen Brewster,et al. Experimentally Derived Guidelines for the Creation of Earcons , 2001 .
[63] Gregory Kramer,et al. Auditory Display: Sonification, Audification, And Auditory Interfaces , 1994 .
[64] P. F. Adams,et al. Current estimates from the National Health Interview Survey, 1994. , 1995, Vital and health statistics. Series 10, Data from the National Health Survey.
[65] Mexhid Ferati,et al. Acoustic interaction design through "audemes": experiences with the blind , 2009, SIGDOC '09.
[66] A. D. N. Edwards,et al. Weasel: a computer based system for providing non-visual access to music notation , 2000, SIGC.
[67] J. Trouvain,et al. Comprehension of Ultra-Fast Speech - Blind vs. "Normally Hearing" Persons , 2007 .
[68] Schloss Birlinghoven,et al. Using Audification in Planetary Seismology , 2001 .
[69] Robert David Stevens,et al. Principles for the Design of Auditory Interfaces to Present Complex Information to Blind People , 1996 .
[70] Catherine Plaisant,et al. Non-visual exploration of geographic maps: Does sonification help? , 2010, Disability and rehabilitation. Assistive technology.
[71] Stefano Ceri,et al. Web Modeling Language (WebML): a modeling language for designing Web sites , 2000, Comput. Networks.
[72] Jaime Sánchez,et al. Memory enhancement through audio , 2004, ACM SIGACCESS Access. Comput..
[73] M. McDaniel,et al. Illustrations as adjuncts to prose: a text-appropriate processing approach , 1988 .
[74] Eoin Brazil,et al. Human-Computer Interaction Design based on Interactive Sonification - Hearing Actions or Instruments/Agents. , 2004 .
[75] Sook Young Won. Auditory display of genome data: Human chromosome 21 , 2005 .
[76] Manne-Sakari Mustonen,et al. A review-based conceptual analysis of auditory signs and their design , 2008 .
[77] Maribeth Back,et al. Micro-narratives in sound design: Context, character, and caricature in waveform manipulation , 1996 .
[78] A. Paivio,et al. Dual coding theory and education , 1991 .
[79] Stephen M. Kosslyn,et al. Elements of graph design , 1993 .
[81] Meera Blattner,et al. Earcons and Icons: Their Structure and Common Design Principles , 1989, Hum. Comput. Interact..
[82] Janan Al-Awar Smither. Short term memory demands in processing synthetic speech by old and young adults , 1993, Behav. Inf. Technol..
[83] Kenneth I. Joy,et al. Sound graphs: A numerical data analysis method for the blind , 1985, Journal of Medical Systems.
[84] Stephen A. Brewster,et al. Maximising screen-space on mobile computing devices , 1999, CHI Extended Abstracts.
[85] D. S. Brungart. Control of perceived distance in virtual audio displays , 1998, Proceedings of the 20th Annual International Conference of the IEEE Engineering in Medicine and Biology Society.
[86] Yvonne Eriksson,et al. Computer games for children with visual impairments , 2005 .
[87] C. Schoenborn,et al. Current estimates from the National Health Interview Survey. , 1988, Vital and health statistics. Series 10, Data from the National Health Survey.
[88] Mexhid Ferati,et al. Using Audemes as a Learning Medium for the Visually Impaired , 2009, HEALTHINF.
[89] William W. Gaver,et al. Effective sounds in complex systems: the ARKOLA simulation , 1991, CHI.
[90] E. Diener,et al. Sex differences in the recall of affective experiences. , 1998, Journal of personality and social psychology.
[91] Mexhid Ferati,et al. Assessing the effectiveness of distributed pair programming for an online informatics curriculum , 2010, INROADS.
[92] Mexhid Ferati,et al. Aural browsing on-the-go: listening-based back navigation in large web architectures , 2012, CHI.
[93] J. Mezrich,et al. Dynamic Representation of Multivariate Time Series Data , 1984 .
[94] Bruce N. Walker,et al. Advanced auditory menus: design and evaluation of auditory scroll bars , 2008, Assets '08.
[95] Roman Vilimek,et al. Effects of Speech and Non-Speech Sounds on Short-Term Memory and Possible Implications for In-Vehicle Use , 2005, ICAD.
[96] M. Lassonde,et al. Blind subjects process auditory spectral cues more efficiently than sighted individuals , 2004, Experimental Brain Research.
[97] B. Shinn-Cunningham. Applications of virtual auditory displays , 1998, Proceedings of the 20th Annual International Conference of the IEEE Engineering in Medicine and Biology Society.
[98] Antti Pirhonen,et al. Same sound – Different meanings : A Novel Scheme for Modes of Listening , 2010 .
[99] Gustavo Rossi,et al. An Object Oriented Approach to Web-Based Applications Design , 1998, Theory Pract. Object Syst..
[100] Lisa Feldman Barrett,et al. Sex Differences in Emotional Awareness , 2000 .
[101] Stephen Brewster,et al. Designing non-speech sounds to support navigation in mobile phone menus , 2000 .
[102] Rebecca L. Oxford,et al. Instructional Implications of Gender Differences in Second/Foreign Language (L2) Learning Styles and Strategies. , 1993 .
[103] Tilman Dingler,et al. Learnability of Sound Cues for Environmental Features: Auditory Icons, Earcons, Spearcons, and Speech , 2008 .
[104] Krzysztof Z. Gajos,et al. Ability-Based Design: Concept, Principles and Examples , 2011, TACC.
[105] Stephen Brewster,et al. A Detailed Investigation into the Effectiveness of Earcons , 1997 .
[106] Barry H. Kantowitz,et al. Human Factors: Understanding People-System Relationships , 1983 .
[107] J. Laird,et al. Remembering What You Feel: Effects of Emotion on Memory , 1982 .
[108] Douglas Turnbull,et al. Modeling the Semantics of Sound , 2007 .
[109] Stephen A. Brewster,et al. An evaluation of earcons for use in auditory human-computer interfaces , 1993, INTERCHI.
[110] Mexhid Ferati,et al. Towards a Modeling Language for Designing Auditory Interfaces , 2009, HCI.
[111] Hermann Schmitt,et al. Dynamic Representation , 1995, American Political Science Review.
[112] Gregg C. Vanderheiden,et al. Web content accessibility guidelines 1.0 , 2001, INTR.
[113] William W. Gaver. The SonicFinder: An Interface That Uses Auditory Icons , 1989, Hum. Comput. Interact..
[114] Myounghoon Jeon,et al. “Spindex”: Accelerated Initial Speech Sounds Improve Navigation Performance in Auditory Menus , 2009 .
[115] Gregory D. Abowd,et al. No-Look Notes: Accessible Eyes-Free Multi-touch Text Entry , 2010, Pervasive.
[116] S. McAdams,et al. Thinking in Sound: The Cognitive Psychology of Human Audition , 1993 .
[117] Christopher Frauenberger,et al. Interaction patterns for auditory user interfaces , 2005 .
[118] Bruce N. Walker,et al. Spearcons: Speech-Based Earcons Improve Navigation Performance in Auditory Menus , 2006 .
[119] Phyllis Levine,et al. After High School: A First Look at the Postschool Experiences of Youth with Disabilities. A Report from the National Longitudinal Transition Study-2 (NLTS2). , 2005 .
[120] Terri L. Bonebright,et al. Testing the effectiveness of sonified graphs for education: A programmatic research project , 2001 .