Investigating the effects of sound masking on the use of audio CAPTCHAs

The SoundsRight Audio CAPTCHA (Completely Automated Public Turing test to tell Computers and Humans Apart) was developed with the goal of providing a usable and secure audio CAPTCHA for people with visual impairments. Its design requires users to repeatedly identify a specific sound from a group of different sounds (e.g., a baby crying and a bird chirping) in real time. Adding background noise (sound masks) to the sounds may make it more difficult for automated software to recognise the sounds and therefore improve security. However, the sound masks may also make it more challenging for human users to recognise the sounds. We conducted a user study involving 20 blind participants and 20 sighted participants to investigate the effect of sound masks on the usability of the SoundsRight CAPTCHA. The results suggest that sound masks have a significant impact on both the failure rate and the response time. Sighted participants had a significantly higher failure rate than blind participants and were more vulnerable to the negative effect of sound masks.
