The dynamic nature of the Web poses problems for usability evaluation. Development cycles are rapid, and Web sites change frequently, often without a chance to re-evaluate the usability of the entire site. New advances in Web development also change user expectations. To incorporate usability evaluation into such an environment, we must produce methods that are compatible with these development constraints. We believe that rapid, remote, and automated evaluation techniques are key to ensuring usable Web sites. In this paper, we describe three studies we carried out to explore the feasibility of using modified usability testing methods, or non-traditional methods of obtaining usability information, to satisfy our criteria of rapid, remote, and automated evaluation. Based on lessons learned in these case studies, we are developing tools for rapid, remote, and automated usability evaluation. Our future work includes applying these tools to a variety of Web sites to determine 1) their effectiveness compared to traditional evaluation methods, 2) the optimal types of sites and stages of development for each tool, and 3) tool enhancements.