Human Error Recontextualised

The design of interactive systems for use in complex dynamic domains often leads to new technology that is introduced into evolving and ongoing operational practice. In such new technology, system builders tend to design primarily for data availability, not necessarily for operability. Operability is taken to be the responsibility of the operator: it is (s)he who must find the right data at the right time. On the basis of their knowledge of the domain and the practitioner population, designers make (and have to make) assumptions about how a human will perform in relation to their system: how they will find the right data at the right time and act correctly in response. Such assumptions are sometimes more, and sometimes less, well informed. For example, a designer may assume that the practitioner will know where to look in the display layout given a certain system state, or that training will take care of a particular knowledge requirement.

Once systems are fielded, however, it often turns out that some of these operability assumptions are unwarranted. Practitioners may in fact not know where to look, or they may not call the right piece of knowledge to mind in that context. The problem is that current human error assessment techniques do not systematically support a designer in evaluating the validity of assumptions about human performance. This problem is illustrated by the huge gulf between the terms used to describe human performance issues before, during and after a design is fielded. For instance, in the design phase many human error techniques analyse task descriptions for “errors of omission”. Yet the practitioner caught up in an evolving problem situation may report that “he couldn’t keep track of what the system was doing”. Similarly, analysts of mishaps rarely use the terms of the predictive techniques post hoc, speaking instead of such things as “a lack of situation awareness”. What these differences in terminology point to is that current human error evaluation techniques may fall short in how they capture the operational context and the practitioner’s cognition that lie behind the creation of interaction failures. We look at these shortfalls in turn below.
