Analysing interactive devices based on information resource constraints

Analysis of the usability of an interactive system requires both an understanding of how the system is to be used and a means of assessing the system against that understanding. Such analytic assessments are particularly important in safety-critical systems, because latent vulnerabilities may exist that have negative consequences only in certain circumstances. Many existing approaches to assessment use tasks or scenarios to provide an explicit representation of their understanding of use. These normative user behaviours have the advantage that they clarify assumptions about how the system will be used, but the disadvantage that they may exclude many plausible deviations from these norms. As a result, assessments of how a design fails to support user behaviour can become a matter of judgement based on individual experience rather than evidence. We present a systematic formal method for analysing interactive systems that is based on constraints rather than prescribed behaviour. These constraints capture precise assumptions about what information resources are used to perform actions; the resources may reside in the system itself or be external to it. The approach is applied to two medical device designs: infusion pumps currently in common use in hospitals. The two devices are compared on the basis of these resource assumptions, assessing the consistency of interaction within each design.
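To make the central idea concrete, the sketch below (in Python, which is not the notation of the paper's formal method) encodes resource constraints as requirements attached to user actions and checks an enumerated fragment of a device's state space against them. It is illustrative only; the state, action, and resource names are hypothetical.

```python
# Illustrative sketch only: information-resource constraints expressed as
# requirements attached to user actions, checked against an enumerated
# fragment of a device's state space. All names are hypothetical.

from dataclasses import dataclass

@dataclass(frozen=True)
class State:
    name: str
    visible: frozenset  # information resources the interface makes perceivable
    enabled: frozenset  # actions the device permits in this state

# Resource constraints: the information a user is assumed to need per action.
REQUIRES = {
    "increase_rate": frozenset({"current_rate", "cursor_position"}),
    "confirm": frozenset({"pending_value"}),
}

def violations(states):
    """Find (state, action) pairs where an enabled action lacks a required resource."""
    return [
        (s.name, a)
        for s in states
        for a in sorted(s.enabled)
        if a in REQUIRES and not REQUIRES[a] <= s.visible
    ]

# Hypothetical fragment of one infusion pump's state space.
pump = [
    State("rate_entry",
          visible=frozenset({"current_rate", "cursor_position"}),
          enabled=frozenset({"increase_rate"})),
    State("confirm_screen",
          visible=frozenset({"current_rate"}),  # pending_value is not displayed
          enabled=frozenset({"confirm"})),
]

print(violations(pump))  # [('confirm_screen', 'confirm')]
```

In the paper's setting, a check of this kind would be posed over a complete formal model of each device rather than a hand-enumerated fragment, and the same resource assumptions would then support the comparison of the two pumps.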
