Human factors and folk models

This paper discusses the susceptibility of human factors to the use of folk models. The case of automation-induced complacency serves as a guiding example to illustrate how folk models (1) substitute one label for another rather than decomposing a large construct into more measurable specifics; (2) are immune to falsification and so resist the most important scientific quality check; and (3) are easily overgeneralised to situations they were never meant to speak to. We then discuss the link between models and measurements, where the model constrains what can be measured by describing what counts as essential performance, and where the model's parameters become the basis for specifying the measurements. We propose that one way forward for human factors is to de-emphasise the focus on inferred and uncertain states of mind and to shift to characteristics of human performance instead.
