Ebook Showdown: Evaluating Academic Ebook Platforms from a User Perspective

Introduction

Across all types of libraries in the United States, ebook usage and acquisition continue to rise. According to a 2012 Library Journal study, ebook holdings in academic libraries grew an average of 41% between 2011 and 2012, and of the 339 academic libraries surveyed, 95% reported offering ebooks as part of their regular collections.1

Libraries interested in expanding their ebook offerings face an overwhelming variety of publisher and aggregator platforms, package options, and cost models, which must be weighed alongside discipline and user preferences. For many academic librarians, however, questions about the usability and accessibility of digital formats for students come first, casting doubt on the viability of replacing print books with ebooks. Giving all students the independence to read and research on their own is vital to a quality education: “It is the right thing to do, the smart thing to do, and it is the law.”2 As library collections move online, it is essential that publishers offer the features necessary to make ebooks as usable as print titles, as well as accessible to those students whose physical or cognitive disabilities make print books an unworkable option.

The literature includes several reviews of the usability and accessibility of ebook readers, such as the Kobo and Kindle, but few systematic analyses of the software platforms that deliver the academic ebooks purchased by university and college libraries. Books on these platforms are accessed over the Internet through each publisher’s proprietary interface, so the user’s experience can differ from one publisher to another. As part of San Jose State University’s (SJSU) Ebook Accessibility Project (EAP), we evaluated 16 major academic ebook platforms with the goal of helping students and librarians make more informed decisions about which platforms are most accessible and user friendly, particularly for students with disabilities. This paper discusses our findings and offers a summary of our results. (Note: These platform evaluations were performed in June–August 2014; newer versions of the platforms with additional features may have been implemented since that time.)