Professional practice assessment in ambulatory private rheumatology: a pilot evaluation of the medical file content for rheumatoid arthritis.

OBJECTIVE Professional Practice Assessment (PPA) has become an obligation for all physicians in France; however, its modalities remain unclear. The objective of this work was to evaluate the feasibility and accuracy of a PPA performed by private-practice rheumatologists within a network. METHODS The network prepared a list of items considered mandatory to collect during an outpatient visit for rheumatoid arthritis. Non-hospital-based rheumatologists belonging to the network then used this list to evaluate a sample of their patient files, selected in chronological order over a one-month period. Each file was subsequently assessed, with the same list of items, by another private rheumatologist from the group, allocated at random. RESULTS Eighty percent of the private-practice doctors agreed to participate. The mean time to evaluate 15 patient files was 2 hours. For each file, agreement between self-evaluation and external evaluation was good (agreement statistic, 0.75-1.0). The mandatory items were collected in a high proportion of cases (84.6%). CONCLUSION PPA can be performed within a network; self-evaluation is a valid method, and when the list of items is decided by the network, the data are collected satisfactorily.
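The agreement statistic is not named in the abstract; assuming it is Cohen's kappa, the standard chance-corrected measure of inter-rater agreement for categorical items, a minimal sketch of the calculation is:

\kappa = \frac{p_o - p_e}{1 - p_e}

where p_o is the observed proportion of items on which the self-evaluation and the external evaluation agree, and p_e is the proportion of agreement expected by chance alone. Under the conventional Landis and Koch benchmarks, values of 0.61-0.80 denote substantial agreement and 0.81-1.00 almost perfect agreement, which is consistent with the reported range of 0.75-1.0 being characterized as good.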
