Integrating Digital Pens in Breast Imaging for Instant Knowledge Acquisition

Future radiology practice assumes that radiology reports will be uniform, comprehensive, and easily managed, which means that reports must be readable by humans and machines alike. To improve reporting practices in breast imaging, we let the radiologist write structured reports with a special pen on paper printed with an invisible dot pattern. In this way, we provide a knowledge acquisition system for printed mammography patient forms that supports combined work with printed and digital documents. In this domain, printed documents cannot easily be replaced by computer systems because they contain free-form sketches and textual annotations, and acceptance of traditional PC reporting tools among doctors is rather low, since current electronic reporting systems significantly increase the time needed to complete a report. We describe our real-time digital paper application and focus on a case study of our deployed application. We believe that our results motivate the design and implementation of intuitive pen-based user interfaces for the medical reporting process and similar knowledge work domains. Our system imposes only minimal overhead on traditional form-filling processes and enables direct, ontology-based structuring of the user input for semantic search and retrieval applications, as well as other applied artificial intelligence scenarios that involve manual form-based data acquisition.
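To illustrate the idea of ontology-based structuring of form input, the following is a minimal sketch (not the authors' implementation): recognized handwriting from printed form fields is mapped to concept identifiers so the report becomes machine-readable. All field names and concept IDs here are hypothetical placeholders, not actual RadLex codes.

```python
# Hypothetical mapping from form fields to made-up ontology concept IDs.
# A real system would draw these from a medical ontology such as RadLex.
FIELD_ONTOLOGY = {
    "finding": {"mass": "ONT:0001", "calcification": "ONT:0002"},
    "laterality": {"left": "ONT:0010", "right": "ONT:0011"},
}

def structure_report(recognized_fields):
    """Turn recognized handwriting per form field into ontology-tagged entries.

    recognized_fields: dict mapping form field name -> recognized text.
    Returns a list of (field, text, concept_id) triples; unrecognized
    terms get concept_id None so they can be flagged for manual review.
    """
    report = []
    for field, text in recognized_fields.items():
        concepts = FIELD_ONTOLOGY.get(field, {})
        concept_id = concepts.get(text.strip().lower())
        report.append((field, text, concept_id))
    return report

# Example: pen input recognized on a mammography form
structured = structure_report({"finding": "Mass", "laterality": "left"})
```

The key design point is that structuring happens at capture time: each form region carries its own small vocabulary, so the recognizer's output is disambiguated by context rather than by a global dictionary.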
