Modelling and managing student satisfaction: use of student feedback to enhance learning experience

In 2014-15, following a call for expressions of interest open to its subscribers, QAA commissioned six small-scale primary research projects intended to encourage collaboration between providers and promote the formation of communities of practice. This report is one of two on the role of student satisfaction data in quality assurance and enhancement. The reports are not QAA documents, so we have respected the authors' approach to style and presentation. We hope that you will read them with interest. Other topics in the series are the transition experiences of entrants to higher education from increasingly diverse prior educational backgrounds, and an impact study of the guidance documents for higher education providers published by QAA in 2013.

The 2015 National Student Survey results, released recently, highlighted that student satisfaction scores have not increased despite a rise in tuition fees (Havergal, 2015). Understanding the key enablers of, and barriers to, integrating student satisfaction data with Quality Assurance (QA) and Quality Enhancement (QE) was the focus of this small-scale research project. By combining a qualitative perspective (a literature review integrating the perspectives of academics and academic-related staff) with a quantitative one (a case study at the Open University (OU) examining the key drivers of learning satisfaction among 60,000 students), we identified five key challenges for higher education. Most UK institutions now systematically collect learning satisfaction data. Nonetheless, several critics question the appropriateness of these questionnaires, arguing that most learning satisfaction instruments are teacher-centred, focusing on what the instructor does in the learning environment rather than on what learners actually do, how they engage, and whether learning occurred.
While many institutions have become reasonably skilled at collecting large amounts of student satisfaction data, making sense of these rich data sources and acting upon them is complex and cumbersome. Recently, several studies have tried to close this loop. For example, Arbaugh (2014) and Rienties, Toetenel, and Bryan (2015) found across 40+ modules that learning design and teaching support in particular influenced learners' satisfaction. In our case study, using logistic regression modelling of 200 potential explanatory variables for 60,000+ students, we identified the key drivers of students' learning satisfaction. Findings indicated that learning design had a strong and significant impact on overall satisfaction: learners who were more satisfied with the quality of teaching materials, assessment strategies, and workload were significantly more satisfied with the overall …
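The modelling approach described above can be sketched in code. This is a minimal illustration under stated assumptions, not the study's actual analysis: the predictor names, the simulated data, and the scikit-learn pipeline are all hypothetical stand-ins, and the study itself screened roughly 200 explanatory variables for 60,000+ students rather than the three toy variables used here.

```python
# Hedged sketch: logistic regression on simulated survey data, mimicking
# the shape of the OU case study's analysis (binary outcome: satisfied
# overall or not). All variable names and data are hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n_students = 5000  # the study used 60,000+; kept small here

# Hypothetical explanatory variables (standardised survey responses).
X = np.column_stack([
    rng.normal(size=n_students),  # satisfaction with teaching materials
    rng.normal(size=n_students),  # satisfaction with assessment strategy
    rng.normal(size=n_students),  # perceived workload
])

# Simulate an outcome driven mainly by the first two predictors,
# echoing the finding that learning-design factors dominate.
logits = 1.2 * X[:, 0] + 0.8 * X[:, 1] - 0.3 * X[:, 2]
y = (rng.random(n_students) < 1 / (1 + np.exp(-logits))).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression().fit(X_train, y_train)

# Exponentiated coefficients give odds ratios, the usual effect-size
# reading for logistic regression in survey research.
odds_ratios = np.exp(model.coef_[0])
accuracy = model.score(X_test, y_test)
```

In a real analysis the 200 candidate variables would also need multicollinearity checks and some form of variable selection before the odds ratios could be interpreted as "key drivers".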

[1] J. B. Arbaugh, et al. System, scholar or students? Which most influences online MBA course effectiveness?, 2014, J. Comput. Assist. Learn.

[2] Roy D. Pea, et al. Educational data scientists: a scarce breed, 2013, LAK '13.

[3] Sean B. Eom, et al. The Determinants of Students' Perceived Learning Outcomes and Satisfaction in University Online Education: An Empirical Investigation, 2006.

[4] Agustín C. Caminero, et al. Analyzing the students' behavior and relevant topics in virtual learning communities, 2014, Comput. Hum. Behav.

[5] Clem Herman, et al. Returning to STEM: gendered factors affecting employability for mature women students, 2014.

[6] J. Arbaugh, et al. A Structural Equation Model of Predictors for Effective Online Learning, 2005.

[7] Qing Gu, et al. Learning and growing in a ‘foreign’ context: intercultural experiences of international students, 2010.

[8] Herbert W. Marsh, et al. SEEQ: A Reliable, Valid, and Useful Instrument for Collecting Students' Evaluations of University Teaching, 1982.

[9] John Hattie, et al. Visible Learning: A Synthesis of Over 800 Meta-Analyses Relating to Achievement, 2008.

[10] Bart Rienties, et al. Implementing a Learning Analytics Intervention and Evaluation Framework: what works?, 2017.

[11] Tena B. Crews, et al. Online Course Evaluations: Faculty Perspective and Strategies for Improved Response Rates, 2011.

[12] D. Paulhus. Measurement and control of response bias, 1991.

[13] Bart Rienties, et al. "Scaling up" learning design: impact of learning design activities on LMS behavior and performance, 2015, LAK.

[14] Nancy Blattner, et al. Guarding Against Potential Bias in Student Evaluations: What Every Faculty Member Needs to Know, 2003.

[15] Talitha Bennett, et al. The Move to a System of Flexible Delivery Mode (Online v Paper) Unit of Study Student Evaluations at Flinders University: Management Issues and the Study of Initial Changes in Survey Volume, Response Rate and Response Level, 2010.

[16] Young Ik Cho, et al. The Relation Between Culture and Response Styles, 2005.

[17] Carol Calvert. Developing a model and applications for probabilities of student success: a case study of predictive analytics, 2014.

[18] Dirk T. Tempelaar, et al. In search for the most informative data for feedback generation: Learning analytics in a data-rich context, 2015, Comput. Hum. Behav.

[19] John T. E. Richardson, et al. The role of response biases in the relationship between students’ perceptions of their courses and their approaches to studying in higher education, 2012.

[20] Sarah J. Stein, et al. Can you increase teacher engagement with evaluation simply by improving the evaluation system?, 2016.

[21] Bart Rienties, et al. Understanding academics’ resistance towards (online) student evaluation, 2014.

[22] John T. E. Richardson, et al. Approaches to studying across the adult life span: Evidence from distance education, 2013.

[23] Zdenek Zdráhal, et al. Developing predictive models for early detection of at-risk students on distance learning modules, 2014, LAK Workshops.

[24] P. Ramsden. A performance indicator of teaching quality in higher education: The Course Experience Questionnaire, 1991.

[25] Francisco J. García-Peñalvo, et al. Human-computer interaction in evolutionary visual software analytics, 2013, Comput. Hum. Behav.

[26] Fengfeng Ke, et al. Toward deep learning for adult students in online courses, 2009, Internet High. Educ.

[27] David W. Hosmer, et al. Applied Logistic Regression, 1991.

[28] Jordan J. Titus. Student Ratings in a Consumerist Academy: Leveraging Pedagogical Control and Authority, 2008.

[29] Helen Gartley, et al. The Quality Assurance Agency for Higher Education, 2002.

[30] J. Beishuizen, et al. Student learning experience as indicator of teaching quality, 2012.

[31] Rebecca Ferguson, et al. Innovating Pedagogy 2014, 2014.

[32] G. Gibbs, et al. The Evaluation of the Student Evaluation of Educational Quality Questionnaire (SEEQ) in UK Higher Education, 2001.

[33] Richard E. Mayer, et al. Multimedia Learning: The Promise of Multimedia Learning, 2001.

[34] Carolyn Penstein Rosé, et al. Learning analytics and machine learning, 2014, LAK.

[35] A. Agresti. An introduction to categorical data analysis, 1997.

[36] Gráinne Conole, et al. Designing for Learning in an Open World, 2012.

[37] Alan Woodley, et al. National student feedback surveys in distance education: an investigation at the UK Open University, 2011.

[38] Spyros Konstantopoulos, et al. Computing Power of Tests of the Variance of Treatment Effects in Designs With Two Levels of Nesting, 2008, Multivariate Behavioral Research.

[39] M. S. Patel, et al. An introduction to meta-analysis, 1989, Health Policy.

[40] Dirk T. Tempelaar, et al. A review of the role of information communication technology and course design in transitional education practices, 2012, Interact. Learn. Environ.

[41] Kathleen M. T. Collins, et al. Students’ Perceptions of Characteristics of Effective College Teachers: A Validity Study of a Teaching Evaluation Form Using a Mixed-Methods Analysis, 2007.

[42] Feng Li, et al. An Introduction to Metaanalysis, 2005.