Objective structured clinical examination (OSCE) revisited

The objective structured clinical examination (OSCE) was introduced in 1975 as a standardized tool for objectively assessing clinical competencies, including history-taking, physical examination, communication skills, and data interpretation. It consists of a circuit of stations connected in series, with each station devoted to the assessment of a particular competency using predetermined guidelines or checklists. The OSCE has been used for both formative and summative evaluation of undergraduate and postgraduate medical students across the globe. Its use for formative assessment has great potential, as learners can gain insight into the elements that make up clinical competence as well as feedback on their personal strengths and weaknesses. However, the success of an OSCE depends on the adequacy of resources, including the number and construction of stations, the method of scoring (checklists and/or global ratings), the number of students assessed, and sufficient time and money. Lately, the OSCE has drawn criticism regarding its validity, feasibility, practicality, and objectivity. There is evidence that many OSCEs may be too short to yield reliable results, and there are currently no clear-cut standards for passing an OSCE. It is also perceived that OSCEs test a student's knowledge and skills in a compartmentalized fashion rather than addressing the patient as a whole. This article focuses on the issues of validity, objectivity, reliability, and standard setting in the OSCE. At present, Indian experience with the OSCE is limited, and there is a need to sensitise Indian faculty and students to it. A cautious approach is warranted before the OSCE is adopted as a supplementary tool alongside other methods of assessment in summative examinations in Indian settings.