Automated Accessibility Evaluation Software for Authenticated Environments - A Heuristic Usability Evaluation

Web accessibility, that is, the need to make Web content accessible to all people regardless of their abilities or disabilities, has been widely discussed. While some testing techniques require human intervention, accessibility can also be evaluated with automated tools: software programs that examine the code of Web pages to determine whether it conforms to a set of accessibility guidelines, often based on the Web Content Accessibility Guidelines 2.0 (WCAG 2.0) developed by the World Wide Web Consortium (W3C). In this context, the purpose of this study is to analyze an automated tool for evaluating authenticated environments and to verify its usability, since automated systems must deliver precise and reliable results in any type of environment. Accordingly, this paper evaluates the ASES software through a heuristic evaluation carried out by three experts. The analysis revealed major accessibility problems, improper functioning of the available tools, and inconsistency in the results produced. Furthermore, ASES was found to have problems of efficiency, interaction, validity, and reliability in the results it presents. Given that ASES is an open-source accessibility testing tool distributed on a government Web site, and that few tools are available for evaluating authenticated environments, correcting the deficiencies identified in this study is highly recommended.