Design-Time Web Usability Evaluation with Guideliner

A diverse range of smartphones and tablet computers has become an integral part of modern life. An essential requirement for web application development is to follow web usability guidelines when designing the web user interface (UI). Even a minor change in the UI can lead to usability problems. Empirical evaluation methods such as interviews, questionnaires combined with user tests, and card sorting are effective at finding such problems. Nevertheless, several obstacles prevent the application of these methods, especially for evaluating minor UI changes: the time and human resources they require, and the amount of data to be processed. This publication presents Guideliner, a tool for automatic design-time evaluation of web UI conformance to predefined usability guidelines. The main contribution of the presented solution is enabling immediate, cost-efficient, and automated evaluation of a web UI against available and established standards.
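To make the idea of guideline-based conformance checking concrete, the sketch below shows a toy checker, not Guideliner itself (whose guideline model is ontology-based). It evaluates raw HTML markup against two sample usability rules at design time; the rules and class names are illustrative assumptions, not part of the published tool.

```python
# Illustrative sketch only: a minimal guideline-based checker that scans
# markup for violations of two sample usability rules. Guideliner's real
# rule set and ontology-driven evaluation are far richer than this.
from html.parser import HTMLParser


class GuidelineChecker(HTMLParser):
    """Collects violations of two sample guidelines:
    images must carry alt text, and inputs must declare a type."""

    def __init__(self):
        super().__init__()
        self.violations = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img" and not attrs.get("alt"):
            self.violations.append((tag, "missing alt text"))
        if tag == "input" and "type" not in attrs:
            self.violations.append((tag, "missing input type"))


def evaluate(markup):
    """Return a list of (tag, problem) pairs found in the markup."""
    checker = GuidelineChecker()
    checker.feed(markup)
    return checker.violations


if __name__ == "__main__":
    ui = '<form><img src="logo.png"><input name="q"></form>'
    for tag, problem in evaluate(ui):
        print(f"{tag}: {problem}")
```

Because such a check needs only the page markup, it can run immediately after every edit to the UI, which is precisely the cost advantage over interviews, user tests, and card sorting that the abstract argues for.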
