Measurement for evaluating the learnability and resilience of methods of cognitive work

Some experiments on human–computer interaction are aimed at evaluating hypotheses concerning cognitive work. Other experiments are intended to evaluate the software tools that shape the cognitive work. In both cases, effective experimentation is premised on the control and factorial analysis of sources of variability, which entails programmes of experimentation. However, the pace of change in sociotechnical systems generally makes them a ‘moving target’ for such programmes. The objective of this study was to create a general approach to experimental design and the measurement of cognitive work that satisfies the requirements for experimentation and yet also provides a ‘fast track’ to the evaluation of software-supported cognitive work. A measure called i-bar, defined as the inverse of the mid-range, is presented. The statistic is derived from data on trials-to-criterion in tasks that require practice and learning. This single measure is interpreted as a conjoint measurement scale, permitting: (a) evaluation of the sensitivity of the principal performance measure (which is used to set the metric for trials to criterion); (b) evaluation of the learnability of the work method (i.e. the goodness of the software tool); and (c) evaluation of the resilience of the work method. It is shown that such order statistics can be modelled mathematically and that methods for estimating likelihoods can be derived. This involves novel ways of thinking about statistical analysis for discrete non-Gaussian distributions. The idea and method presented herein should be applicable to the study of the effects of any training or intervention, including software interventions designed to improve legacy work methods and interventions that involve creating entirely new cognitive work systems.
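
As an illustrative sketch only (not code from the paper): assuming that ‘mid-range’ denotes the conventional (min + max)/2 of a sample of trials-to-criterion counts and that ‘inverse’ denotes its reciprocal, the statistic could be computed as follows in Python. The function names are hypothetical.

    # Hypothetical sketch: inverse of the mid-range of trials-to-criterion data.
    # Assumes mid-range = (min + max) / 2 and "inverse" = reciprocal.

    def trials_to_criterion(scores, criterion):
        """Return the 1-based index of the first trial whose score meets the
        criterion on the principal performance measure, or None if never met."""
        for trial, score in enumerate(scores, start=1):
            if score >= criterion:
                return trial
        return None

    def inverse_midrange(trial_counts):
        """Reciprocal of the mid-range of a sample of trials-to-criterion counts."""
        midrange = (min(trial_counts) + max(trial_counts)) / 2.0
        return 1.0 / midrange

    # Example: trials-to-criterion for five participants using one work method.
    counts = [4, 6, 3, 8, 5]
    print(inverse_midrange(counts))  # 2 / (3 + 8) = 0.1818...

On this reading, larger values would correspond to fewer trials needed to reach criterion, that is, to a more learnable work method.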
