For many education providers, student engagement is a major issue. Given the positive correlation between engagement and good performance, providers are continually looking for ways to engage students in the learning process. The growth of student digital literacy, the wide proliferation of online tools and an understanding of why online gaming can be addictive have combined to create a set of tools that providers can leverage to enhance engagement. One such tool is PeerWise, https://peerwise.cs.auckland.ac.nz/, an online multiple choice question (MCQ) and answer tool in which students create questions that are answered by other students.

Why use MCQs? MCQs test knowledge, provide reassurance of learning, identify gaps and make this data available to both student and provider. Students use this information to focus their time on areas requiring additional work [1], benefiting from the early feedback provided. Formative assessments using MCQs are beneficial in preparing students for summative testing and are appreciated and liked by students [2]. Providers can use this information to determine how the material is being received and react accordingly.

Students use PeerWise to create MCQs that are answered, rated and commented on by their peers. Engagement in PeerWise earns trophies for contributing, for regular use and for providing feedback, all of which act to stimulate further engagement, following the principles of gamification.

Bournemouth University, a public university in the UK with over 18,000 students, has been embedding PeerWise in undergraduate and postgraduate units since 2014. The results experienced by Bournemouth University have been beneficial and are consistent with other studies of PeerWise use [3], [4]. One cohort of students showed a statistically significant improvement over the previous year, in which PeerWise was not used. However, no correlation was found between PeerWise participation and a student's unit mark.
The processes followed by Bournemouth University, and the advantages and disadvantages observed, backed by qualitative and quantitative data, are presented so that other institutions can form an informed view of the merits of PeerWise for their own teaching and learning environments.
[1] D. Hounsell. Towards more sustainable feedback to students, 2007.
[2] Paul Denny. Motivating online collaborative learning, 2010, ITiCSE '10.
[3] Stephen Swailes, et al. The Dimensionality of Honey and Mumford's Learning Styles Questionnaire, 1999.
[4] M. Hanrahan. The effect of learning environment factors on students' motivation and learning, 1998.
[5] Susan Bloxham, et al. The busy teacher educator's guide to assessment, 2007.
[6] Diana Percy, et al. The Process of Learning, 1989.
[7] D. Krathwohl. A Taxonomy for Learning, Teaching and Assessing, 2008.
[8] Stephen W. Draper, et al. Catalytic assessment: understanding how MCQs and EVS can foster deep learning, 2009, Br. J. Educ. Technol.
[9] John Hamer, et al. The quality of a PeerWise MCQ repository, 2010, ACE '10.
[10] John Hamer, et al. Student use of the PeerWise system, 2008, ITiCSE.
[11] Beryl Plimmer, et al. Activities, affordances and attitude: how student-generated questions assist learning, 2012, ITiCSE '12.
[12] Michael Fielding, et al. Students as Radical Agents of Change, 2001.
[13] Quintin I. Cutts, et al. Peer instruction, 2012, Commun. ACM.
[14] Beth Simon, et al. Quality of student contributed questions using PeerWise, 2009, ACE '09.
[15] Noel Entwistle, et al. Promoting deep learning through teaching and assessment: conceptual frameworks and educational contexts, 2000.
[16] Paul W. Foos, et al. Effects of Student-Written Questions on Student Test Performance, 1989.
[17] G. Gibbs, et al. Conditions Under Which Assessment Supports Students' Learning, 2005.
[18] John Hamer, et al. Student use of the PeerWise system, 2008, SIGCSE 2008.