Clique: a conversant, task-based audio display for GUI applications

The purpose of the Clique project is to explore a new way of adapting applications with graphical user interfaces (GUIs) for use in audio. Existing adaptation methods retain the components and metaphors of visual interfaces in the audio displays they produce. Clique, on the other hand, presents the user with a conversational audio display based on the tasks supported by programs, not their visual representations. The user interacts solely with this audio display while Clique takes charge of inspecting and controlling the underlying programs via their GUIs. In effect, the graphical nature of the program interfaces is hidden from the listening user, who is free to concentrate on his or her tasks in audio. We hypothesize that audio displays produced in this manner will prove more effective and satisfying for common tasks than current solutions.

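As a purely illustrative sketch, not drawn from the Clique implementation itself, the following Python fragment shows the mediation pattern the abstract describes: the listener converses with task objects in audio while an adapter inspects and drives the underlying program's GUI. Every name here (GuiAdapter, Task, compose_mail_task) is a hypothetical stand-in; printed text substitutes for speech synthesis, and typed input substitutes for speech recognition.

```python
# Hypothetical sketch of a task-based audio mediator over a GUI program.
# None of these names come from Clique; they only illustrate the idea that
# the user hears task prompts while an adapter manipulates GUI widgets.

from dataclasses import dataclass
from typing import Callable, Dict, List


def speak(text: str) -> None:
    """Stand-in for a text-to-speech engine; prints instead of synthesizing."""
    print(f"[audio] {text}")


class GuiAdapter:
    """Hypothetical bridge to a program's GUI.

    In a real system these methods would call a platform accessibility or
    automation API; here they only simulate widget state.
    """

    def __init__(self) -> None:
        self._fields: Dict[str, str] = {"To": "", "Subject": "", "Body": ""}

    def set_field(self, name: str, value: str) -> None:
        # Would locate the named widget in the GUI and fill it in.
        self._fields[name] = value

    def press_button(self, name: str) -> None:
        # Would activate the named button in the GUI.
        speak(f"{name} activated in the underlying program.")


@dataclass
class Task:
    """A unit of work phrased for the listener, independent of GUI layout."""
    prompt: str
    action: Callable[[str], None]


def compose_mail_task(gui: GuiAdapter) -> List[Task]:
    """Tasks for 'compose a message', decoupled from the mail GUI's widgets."""
    return [
        Task("Who is the message for?", lambda v: gui.set_field("To", v)),
        Task("What is the subject?", lambda v: gui.set_field("Subject", v)),
        Task("Dictate the body of the message.", lambda v: gui.set_field("Body", v)),
        Task("Press enter to send.", lambda _: gui.press_button("Send")),
    ]


if __name__ == "__main__":
    gui = GuiAdapter()
    for task in compose_mail_task(gui):
        speak(task.prompt)
        task.action(input("> "))  # speech recognition would replace typed input
```

The point of the sketch is only the separation of concerns: the task list is expressed in the user's terms, while knowledge of buttons, fields, and windows stays inside the adapter, which is the role Clique assigns to its GUI inspection and control layer.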