An Examination of the Appropriateness of Using a Common Peer Assessment Instrument to Assess Physician Skills across Specialties

Problem Statement. To determine whether a common peer assessment instrument can assess competencies across the internal medicine, pediatrics, and psychiatry specialties. Method. A common 36-item peer survey was used to assess psychiatry (n = 101), pediatrics (n = 100), and internal medicine (n = 103) specialists. Cronbach's alpha and generalizability analysis were used to assess reliability, and factor analysis was used to address validity. Results. A total of 2,306 surveys (94.8% response rate) were analyzed. The Cronbach's alpha coefficient was .98. The generalizability analysis (mean of 7.6 raters per physician) produced a coefficient of Ep² = .83. Four factors emerged, with a similar pattern of relative importance for pediatricians and internal medicine specialists, whose first factor was patient management; communication was the first factor for psychiatrists. Conclusions. The reliability and generalizability data suggest that using the instrument across specialties is appropriate, and the differences in factor structure confirm the instrument's ability to discriminate among specialties, providing evidence of validity.
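The internal-consistency statistic named in the Method, Cronbach's alpha, can be computed directly from a respondents-by-items rating matrix. The sketch below is purely illustrative: the data are synthetic (the study's 36-item survey responses are not reproduced here), and the function is a minimal standard implementation of the alpha formula, not the authors' analysis code.

```python
import numpy as np

def cronbach_alpha(ratings: np.ndarray) -> float:
    """Cronbach's alpha for a 2-D rating matrix.

    rows = respondents (raters), columns = items.
    alpha = k/(k-1) * (1 - sum(item variances) / variance of total scores)
    """
    k = ratings.shape[1]
    item_vars = ratings.var(axis=0, ddof=1)          # per-item sample variance
    total_var = ratings.sum(axis=1).var(ddof=1)      # variance of summed scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Synthetic example: 5 raters scoring 3 items on a 5-point scale
ratings = np.array([[4, 5, 4],
                    [3, 3, 4],
                    [5, 5, 5],
                    [2, 3, 2],
                    [4, 4, 5]])
print(round(cronbach_alpha(ratings), 3))  # → 0.918
```

Values near 1 indicate that items vary together across respondents; an alpha of .98, as reported in the abstract, suggests very high internal consistency (and possibly some item redundancy) in the 36-item survey.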
