Usability testing at the University of Arizona Library: How to let the users in on the design

While the business world has made great strides in focusing on customer service by studying customers' needs and behaviors, libraries have tended to structure their holdings and services around what they believed was good for their customers. It is the "librarians know best" syndrome, and it is pervasive throughout our profession. At the University of Arizona Library, we have been trying to change that model. All the staff are actively working to transform our institution into a user-focused library. We are doing this by asking our users what they need and by listening carefully to what they say. We are continuously monitoring what we are doing and asking what we could be doing better. One important change we are making in our behavior is to evaluate our activities through our customers' eyes and ask ourselves what effect our actions will have on customer service and satisfaction.

A combination of factors came together in 1997 to prompt us to redesign SABIO, the library's information gateway (see figure 1). In addition to our library's increased emphasis on user needs assessment, we were concerned about the rapid expansion of Web-based indexes and full-text databases and the obvious frustration of our customers with the current gateway. A project team, Access 2000, was created to redesign SABIO. The team was charged with including our customers in the redesign process in order to create an end product that would increase their satisfaction with, and success in, finding the information sources they needed via the gateway.

[Figure 1: illustration omitted]

Two years later, we have a unique and highly successful information gateway for our library. We relied on our customers, primarily our students, to guide us in the design. We believe that we have created a user-centered site. In other words, it is a design that fits the user, rather than one that makes the user fit the design.
This article will describe the road we took from the beginning of our project in 1997 to the present. It will outline the different usability evaluation methods we used, how we conducted usability tests, how we analyzed the data, and how we continually redesigned the Web site in response to user input.

Access 2000: The Design Team and Beginnings

Access 2000 consisted of five librarians, one systems expert, and a graphic artist. None of the team members had any experience with, or significant understanding of, user-centered design and usability methodologies when we began. Of necessity, the first order of business was to read and educate ourselves. We found the writings of Jared Spool, Jakob Nielsen, and Jeffrey Rubin most helpful.(1) As we read these works, we also began to gather information from our users. We did this in several ways: a user satisfaction survey, five focus groups (one each for faculty, graduate students, and library staff, and two for undergraduates), and an analysis of customer feedback to the library from a variety of sources.(2) We also visited other Web sites, both library and commercial, collecting ideas that we thought we could use.

Based on our readings, our customers' comments, and our review of other library and commercial Web sites, Access 2000 developed a set of design guidelines (see appendix A).(3) Our site design attempted to follow these guidelines. They helped ground us and give us direction; whenever we ran into a problem, we would go back and review them. Looking back, we believe the guidelines were sound and served us well in our subsequent work. (In fact, we are impressed with how astute we were in developing them so early in our project.) During usability testing, when parts of the design were not working, it was often because we had not followed one or more of our guidelines.
For example, there was a guideline relating to consistency of language between pages. …