Discussion Panel

Cognitive Task Analysis (CTA) has become part of the standard tool set of cognitive engineering. CTAs are routinely used to understand the cognitive and collaborative demands that contribute to performance problems, the basis of expertise, and the opportunities to improve performance through new forms of training, user interfaces, or decision aids. While the need to conduct CTAs is well established, there is little available guidance on ‘best practice’ for how to conduct a CTA or how to evaluate the quality of a CTA conducted by others. This gap matters because the range of consumers of CTAs is expanding to include program managers and regulators who may need to make decisions based on CTA findings. This panel brings together some of the leaders in the development and application of CTA methods to address the question: “Given the variety of methods available, and the lack of rigid guidance on how to perform a CTA, how does one judge the quality of a CTA?” The goal of the panel is to identify points of consensus on ‘best practice’ in conducting and evaluating a CTA, despite differences among particular CTA methods, and to draw insights from unique and provocative perspectives.
