Evaluation Design for Large-Scale, Collaborative Online Archives: Interim Report of the Online Archive of California Evaluation Project

Ongoing evaluation should be a critical aspect of any digital access project because of the important insight it can bring to strategic decision-making and assessment of data integrity. From a systems development perspective, iterative design processes depend upon evaluative feedback from users on issues such as interface design and retrieval strategies in order to refine and enhance prototype and mature systems. A broader rationale for rigorous and replicable evaluation is the current absence of reliable benchmark data drawn from the experiences of digital archives projects, especially those implementing Encoded Archival Description, against which other such projects might be assessed. This paper reviews considerations that went into the design of a multi-faceted evaluation of the Online Archive of California by a team of researchers within the Department of Information Studies at the University of California, Los Angeles, and provides a progress report on findings. Based upon what it has learned from conducting this evaluation, the research team will refine the evaluation design and research instruments into models that might be applied in the collection of benchmark evaluative data by other large-scale, collaborative online archives.