Developing Analytical Inspection Criteria for Health IT Personnel with Minimal Training in Cognitive Ergonomics: A Practical Solution to Improving EHR Usability

EHR usability has been identified as a major barrier to optimizing care quality. One major challenge in improving EHR usability is the lack of systematic training in usability or cognitive ergonomics, both for EHR designers and developers in the vendor community and for EHR analysts who make significant configurations in healthcare organizations. A practical solution is to provide usability inspection tools that EHR analysts can easily operationalize. This project aims to develop a set of usability tools with demonstrated validity and reliability. We present a preliminary study of a metric for cognitive transparency, together with an exploratory experiment testing its validity in predicting the effectiveness of action-effect mapping. Despite the pilot nature of both studies, the metric showed high sensitivity and specificity, and users determined action-effect mappings more accurately and in less time with transparent user interface controls. We plan to expand the sample size in our empirical study.
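The metric's performance is summarized above as sensitivity and specificity. As a reminder of how these are computed from a binary confusion matrix, here is a minimal sketch; the counts are illustrative only and are not data from the study:

```python
# Sensitivity and specificity from a binary confusion matrix.
# All counts below are hypothetical, for illustration only.

def sensitivity(tp: int, fn: int) -> float:
    """True-positive rate: proportion of actual positives correctly flagged."""
    return tp / (tp + fn)

def specificity(tn: int, fp: int) -> float:
    """True-negative rate: proportion of actual negatives correctly passed."""
    return tn / (tn + fp)

# Suppose the metric flags 18 of 20 truly opaque controls (tp=18, fn=2)
# and passes 27 of 30 truly transparent controls (tn=27, fp=3).
print(sensitivity(18, 2))   # 0.9
print(specificity(27, 3))   # 0.9
```

A metric with both values near 1.0 rarely misses problematic controls and rarely raises false alarms, which is what makes it practical for analysts without cognitive-ergonomics training.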
