The Development of a 360-degree Evaluation Model of Medical Curriculum with the Kirkpatrick Hierarchy Approach

A 360-degree evaluation model of the medical curriculum was developed by adhering to the Kirkpatrick Hierarchy. To bridge the research gap in instrument development for 360-degree evaluation, a set of tools derived from multiple stakeholders' perspectives was used. This mixed-method study, using a sequential exploratory design, involved 797 participants between April 2017 and November 2019. The study comprised three phases: a qualitative phase (18 informants), an item construction phase with exploratory factor analysis (EFA) (298 participants), and a validation phase via confirmatory factor analysis (CFA) (481 randomly sampled participants). In-depth interviews were conducted with two stakeholder groups: lecturers and preceptors. Focus group discussion (FGD) sessions were conducted with two further groups: medical students and patients at the teaching hospital. The item construction phase was executed based on the themes that emerged from the qualitative findings, and the final validation stage was performed based on the EFA results. In total, 23 themes were derived from the four stakeholder groups. In the item construction phase, 13 and 10 factors were identified for the lecturer and student instruments, respectively, while 10 scales were used to validate the item constructs of the preceptor and patient tools. The resulting 360-degree evaluation model thus spans the four levels of the Kirkpatrick Hierarchy with four instrument models. The model is valid, well constructed, and accurately reflects its indicator variables, and it is feasible and acceptable for assessing the medical curriculum.
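The EFA step in the item construction phase can be illustrated with a minimal sketch. The study's questionnaire responses are not available here, so the snippet below generates synthetic Likert-style data with a known latent structure; the item counts and factor count are illustrative assumptions, not the study's actual values. Only the 298-respondent sample size is taken from the abstract.

```python
# Minimal sketch of an exploratory factor analysis (EFA) step, assuming
# synthetic data in place of the study's real questionnaire responses.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)

# Hypothetical setup: 298 respondents (as in the study's EFA phase) answer
# 20 items driven by 3 latent factors. These item/factor counts are assumed.
n_respondents, n_items, n_factors = 298, 20, 3
true_loadings = rng.normal(size=(n_factors, n_items))
factor_scores = rng.normal(size=(n_respondents, n_factors))
responses = factor_scores @ true_loadings + rng.normal(
    scale=0.5, size=(n_respondents, n_items)
)

# Fit the exploratory factor model and inspect estimated item loadings;
# in practice one would retain factors and items based on these loadings.
efa = FactorAnalysis(n_components=n_factors, random_state=0)
efa.fit(responses)
print(efa.components_.shape)  # one row of loadings per factor, one column per item
```

A follow-up CFA (as in the study's validation phase) would then test whether the factor structure suggested here fits an independent sample, typically with dedicated structural equation modeling software.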
