For decades, critics have contested the value of continuing medical education (CME), as well as the validity of many CME studies, pointing to methodologic flaws that preclude sound assessment. Although testing for knowledge before and after participation in a given educational program usually shows that physicians learn facts [1], the effect of education on performance in practice, and ultimately on patient care, has been difficult to establish. Nevertheless, direct observations indicate that, after graduation, physicians incorporate new skills and new knowledge into their practices. In 1973, Caplan [2] pointed out, "Scientific proof of effectiveness rests upon varying types of evidence." Citing an old example, he asked, "How many physicians whose formal medical education occurred pre-1940 today treat syphilis with arsphenamine [Salvarsan] or lobar pneumonia with type-specific antisera? Their switch to penicillin (behavior) testifies that somehow they learned something new (education)." Numerous modern examples corroborate such adoption of medical advances. Cardiologists
[1] N. J. Gilman, et al. Determining educational needs in the physician's office. JAMA, 1980.
[2] P. R. Manning, et al. Medicine: Preserving the Passion. 1987.
[3] P. R. Manning, et al. Continuing medical education: the next step. JAMA, 1983.
[4] P. R. Manning, et al. The past, present, and future of continuing medical education: achievements and opportunities, computers and recertification. JAMA, 1987.
[5] B. Starfield, et al. Effectiveness of pediatric care: the relationship between processes and outcome. Pediatrics, 1972.
[6] R. Haynes, et al. Evidence for the effectiveness of CME: a review of 50 randomized controlled trials. JAMA, 1992.
[7] G. Miller, et al. Continuing education and patient care research: physician response to screening test results. JAMA, 1967.
[8] L. W. Green, et al. Health Education Planning: A Diagnostic Approach. 1979.