Electronic synoptic operative reporting: assessing the reliability and completeness of synoptic reports for pancreatic resection.

BACKGROUND: Electronic synoptic operative reports (E-SORs) have replaced dictated reports at many institutions, but whether E-SORs adequately document the components and findings of an operation has received limited study. This study assessed the reliability and completeness of E-SORs for pancreatic surgery developed at our institution.

STUDY DESIGN: An attending surgeon and a surgical fellow prospectively and independently completed an E-SOR after each of 112 major pancreatic resections (78 proximal, 29 distal, and 5 central) over a 10-month period (September 2008 to June 2009). Reliability was assessed by calculating interobserver agreement between the attending and fellow reports. Completeness was assessed by comparing E-SORs with a case-matched (by surgeon and procedure) historical control of dictated reports, using a 39-item checklist developed through an internal and external query of 13 high-volume pancreatic surgeons.

RESULTS: Interobserver agreement between attending and fellow was moderate to very good for individual categorical E-SOR items (κ = 0.65 to 1.00; p < 0.001 for all items). Compared with dictated reports, E-SORs had significantly higher completeness checklist scores (mean 88.8 ± 5.4 vs 59.6 ± 9.2 out of a maximum possible score of 100; p < 0.01) and were available in patients' electronic records significantly sooner (median 0.5 vs 5.8 days from case end; p < 0.01). The mean time to complete an E-SOR was 4.0 ± 1.6 minutes per case.

CONCLUSIONS: E-SORs for pancreatic surgery are reliable, complete in the data they collect, and rapidly available, all of which support their clinical implementation. The inherent strengths of E-SORs offer real promise of a new standard for operative reporting and health communication.
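Interobserver reliability in this study is quantified with the kappa coefficient, which corrects raw rater agreement for the agreement expected by chance. As a minimal illustration of how kappa can be computed for two raters scoring the same categorical item, here is a generic Cohen's kappa sketch; this is not the study's actual analysis code, and the `cohens_kappa` helper and the example labels are hypothetical:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters labeling the same set of cases."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed proportion of cases on which the two raters agree.
    p_observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Agreement expected by chance, from each rater's marginal label frequencies.
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    p_expected = sum(counts_a[c] * counts_b.get(c, 0) for c in counts_a) / n ** 2
    if p_expected == 1.0:  # both raters used one identical label throughout
        return 1.0
    return (p_observed - p_expected) / (1 - p_expected)

# Hypothetical example: attending vs fellow recording a yes/no item on 4 cases.
attending = ["yes", "yes", "yes", "no"]
fellow    = ["yes", "no",  "yes", "no"]
print(cohens_kappa(attending, fellow))  # → 0.5
```

A kappa of 1.0 indicates perfect agreement and 0 indicates chance-level agreement; values in the 0.65 to 1.00 range reported here are conventionally read as moderate to very good.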
