Computer-Mediated Peer Review: A Comparison of Calibrated Peer Review and Moodle’s Workshop

Based on our extensive experience with Computer-Mediated Peer Review (CMPR) software in engineering education, we compare Calibrated Peer Review (developed at UCLA) and Moodle's Workshop. Our examination applies four criteria to each platform's effectiveness, for both students and instructors. First, does the software present a cohesive mental model that facilitates a complex task? Second, does the scaffolding elicit reliable student responses and improve their quality? Third, are students encouraged to use peer commentary in the learning process? Fourth, does the platform collect and return empirical results that can serve as measures of course learning outcomes? We end with suggestions for improving instructors' return on investment and with advice for addressing student receptivity.

1. Peer Review in Student Learning

Peer review has become an important pedagogical strategy in higher education. Modeled on a well-established practice in academic and business communities, the process of colleagues providing advice to colleagues has migrated to the classroom. Topping provides an overview of both the gains and the costs of implementing peer review in various discipline-specific classes. Furthermore, research on collaborative learning has established the credibility of students giving feedback to their peers. The notion of students helping other students in reading and writing was propagated by early advocates such as Kenneth Bruffee. Computer-mediated peer review (CMPR) offers advantages on several dimensions: instructor return on investment, convenience of use, uniformity of delivery, data collection, and gains in student learning outcomes. Today, emerging technologies make asynchronous peer assessment more available and engaging. Additionally, significant research studies of fielded CMPR systems add to the knowledge base for designing and implementing this category of educational technology.
The authors have over a decade of classroom experience using computer-mediated peer review software. Carlson has held three NSF grants to study Calibrated Peer Review™ (CPR) in engineering education. She is not, however, a member of the platform's design and development team. Neither author is a developer for Moodle, although we do run a Moodle website for K-12 STEM educators (http://rose-prism.org). Neither of the authors has a financial interest in the commercial version of CPR. Elsewhere we have examined how CPR provides evidence for learning outcomes. Here, we present a comparison and contrast between Calibrated Peer Review and Moodle's activity tool, Workshop. Our observations are informed by the central questions for peer review as a pedagogical device in engineering education suggested by Gehringer in papers given at ASEE and FIE. Both platforms can facilitate peer review for a range of assignment types: text documents, engineering drawings and visual representations, slide decks to accompany talks, even videos of class presentations. In this review, we focus on written products and their composition process in the context of the writing-as-a-way-of-learning movement.

2. Evolution of Computer-Mediated Peer Review (CMPR) Systems

Despite proven benefits, integrating effective peer review into a course requires considerable effort, so using computers to facilitate the process was a logical progression. Early peer response systems made use of email exchanges among student reviewers. With digital advances in the 1990s, CMPR systems – such as MUCH (Many Using and Creating Hypermedia, 1994) – automated the allocation of files for review, stored responses, calculated results, and gave access to peer feedback. Eschenbach, for example, exploited web-enabled software to integrate e-assessment in an engineering design course.
In the 2000s, increasing use of computers in education, more robust internet connectivity, and advances in peer-to-peer sharing software resulted in improved CMPR systems. Gehringer's group at North Carolina State University developed PG, "a portable, Web-based ... system written in Java" (p. F1B-3). Of note, PG may well have been the first CMPR to include a grade-adjustment algorithm that rewarded students for giving quality feedback. Other richly featured systems moved the CMPR paradigm forward. SWoRD (Scaffolded Writing and Rewriting in the Disciplines), developed at the University of Pittsburgh by Christian D. Schunn, added layers of mediation to the basic design and facilitated richer e-assessment and feedback analysis. Another system, SPARK – developed at the University of Technology, Sydney, Australia – also moved CMPRs closer to becoming highly interactive learning environments rather than automated systems for overcoming logistical hurdles. SPARK focuses on mediating group work by including various peer- and self-assessment workspaces. Like PG, the grading algorithms in SPARK take into account the quality of feedback contributions and the level of engagement from all members of a team.

In summary, today's CMPR systems make use of the improved technical capacity of web-enabled platforms, and many of the burdensome aspects of face-to-face peer review have been alleviated. As we have noted elsewhere, "today's CMPR systems have the characteristics of what an educational technologist would call a 'cognitive tool,' a mental-modeling device aiding learners to enact more powerful strategies for problem-solving than possible without the scaffolding, heuristics, and visualization embedded in the device itself."

3. Background for UCLA's Calibrated Peer Review and Moodle's Workshop

CPR is one of the earliest sophisticated CMPR systems still widely available (see http://cpr.molsci.ucla.edu/).
Workshop, on the other hand, mirrors a more typical open-source development history. CPR was originally fielded in 1995; major refinements have been funded through several NSF grants, and the literature contains a rich body of studies on the system's efficacy. Having gone through six releases, the current version – CPR6 – presents some barriers to adoption. First, this version is no longer free. Second, the purchasing institution must run the core package locally and absorb the overhead for software and network maintenance and for the storage of student submissions. However, UCLA continues to offer access to a free, older version, which we here designate as CPR1. This release is fully maintained, supported, and periodically updated. Like most legacy software, it contains some idiosyncrasies that require some acclimation for smooth operation. Because of its wide accessibility, we used CPR1 in constructing the current comparison / contrast.
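The grade-adjustment idea that recurs across PG, SPARK, and CPR (weighting each peer score by some measure of the reviewer's demonstrated quality) can be sketched in a few lines. The following Python is a minimal illustration under our own assumptions; the function names and formulas are hypothetical and do not reproduce any platform's actual algorithm.

```python
# Hypothetical sketch of competency-weighted peer grading, in the spirit of
# CPR's calibration phase and the feedback-quality weighting in PG and SPARK.
# The scoring formulas below are illustrative assumptions, not real system code.

def reviewer_competency(calibration_scores, instructor_scores, max_error=10.0):
    """Return a competency weight in [0, 1] from a calibration phase:
    1.0 means the reviewer matched the instructor exactly on the
    calibration samples; 0.0 means maximal average disagreement."""
    errors = [abs(r - i) for r, i in zip(calibration_scores, instructor_scores)]
    mean_error = sum(errors) / len(errors)
    return max(0.0, 1.0 - mean_error / max_error)

def weighted_peer_grade(peer_scores, competencies):
    """Combine peer scores as a mean weighted by reviewer competency,
    so that better-calibrated reviewers count for more."""
    total_weight = sum(competencies)
    if total_weight == 0:
        # No reviewer demonstrated competency: fall back to a plain mean.
        return sum(peer_scores) / len(peer_scores)
    return sum(s * w for s, w in zip(peer_scores, competencies)) / total_weight
```

In this sketch, a reviewer who scores the calibration essays exactly as the instructor did receives full weight, while an unreliable reviewer's scores are discounted; the same machinery could also feed a grade bonus for giving quality feedback, as PG reportedly pioneered.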

References

[1] A. Collins et al., "Situated Cognition and the Culture of Learning," 1989.

[2] L. Vygotsky et al., Thought and Language, 1963.

[3] E. A. Eschenbach, "Improving technical writing via web-based peer review of final reports," 31st Annual Frontiers in Education Conference, 2001.

[4] M. M. Diao et al., "'I'm not here to learn how to mark someone else's stuff': an investigation of an online peer-to-peer review workshop tool," 2015.

[5] P. Carlson et al., "Assessing Engineering Design Experiences Using Calibrated Peer Review," 2010.

[6] P. Carlson et al., "Improving Engineering Education with Enhanced Calibrated Peer Review Assessment of a Collaborative Research Project," 2012.

[7] D. Sluijsmans et al., "Effective peer assessment processes: Research findings and future directions," 2010.

[8] A. Figueira et al., "Work in progress – W2: an easy-to-use Workshop module," 39th IEEE Frontiers in Education Conference, 2009.

[9] P. Carlson et al., "Calibrated Peer Review™ and Assessing Learning Outcomes," 2003.

[10] K. I. Spear, Sharing Writing: Peer Response Groups in English Classes, 1987.

[11] E. F. Gehringer et al., "Strategies and mechanisms for electronic peer review," 30th Annual Frontiers in Education Conference, 2000.

[12] J. Snowball et al., "Where angels fear to tread: online peer-assessment in a large first-year class," 2013.

[13] K. Topping et al., Peer Assisted Learning: A Framework for Consultation, 2001.

[14] C. D. Schunn et al., "Scaffolded writing and rewriting in the discipline: A web-based reciprocal peer review system," Computers & Education, 2007.

[15] E. F. Gehringer, "Applications for Supporting Collaboration in the Classroom," 2012.

[16] S. W. Floyd, "In the Age of the Smart Machine: The Future of Work and Power," 1989.

[17] P. A. Carlson et al., "Using Computer-Mediated Peer Review in an Engineering Design Course," IEEE Transactions on Professional Communication, 2008.

[18] K. A. Bruffee, "The Brooklyn Plan: Attaining Intellectual Growth through Peer-Group Tutoring," 1978.