Programmatic assessment of competency-based workplace learning: when theory meets practice

Background: In competency-based medical education, emphasis has shifted towards outcomes, capabilities, and learner-centredness. Together with a focus on sustained evidence of professional competence, this calls for new methods of teaching and assessment. Recently, medical educators have advocated a holistic, programmatic approach to assessment. In addition to maximally facilitating learning, such an approach should improve the validity and reliability of measurements and the documentation of competence development. We explored how, in a competency-based curriculum, current theories on programmatic assessment interacted with educational practice.

Methods: In a development study including evaluation, we investigated the implementation of a theory-based programme of assessment. Between April 2011 and May 2012, quantitative evaluation data were collected and used to guide group interviews that explored the experiences of students and clinical supervisors with the assessment programme. We coded the transcripts, and emerging topics were organised into a list of lessons learned.

Results: The programme focuses mainly on the integration of learning and assessment by motivating and supporting students to seek and accumulate feedback. The assessment instruments were aligned to cover predefined competencies, enabling aggregation of information in a structured and meaningful way. Assessments designed as formative learning experiences were increasingly perceived as summative by students. Peer feedback was experienced as a valuable method of formative feedback. Social interaction and external guidance appeared crucial for scaffolding self-directed learning. Aggregating data from individual assessments into a holistic portfolio judgement required expertise and extensive training and supervision of judges.

Conclusions: A programme of assessment in which low-stakes assessments simultaneously provide formative feedback and input for summative decisions proved not easy to implement. Careful preparation and guidance of the implementation process were crucial. Assessment for learning requires meaningful feedback with each assessment, and special attention should be paid to the quality of feedback at individual assessment moments. Comprehensive faculty development and training for students are essential for the successful implementation of an assessment programme.
