Evoking Information in Probability Assessment: Knowledge Maps and Reasoning-Based Directed Questions

To assess probabilities in decision analysis, and in decision making more generally, decision makers must evoke and apply relevant information. Decision analysts have developed a variety of structuring tools to aid decision makers in these tasks, including influence diagrams and knowledge maps. However, despite the pervasive use of these tools in practice, no empirical tests of them have been reported. The first goal of the present research was therefore to provide an empirical test of the evocative knowledge map methodology. The second was to develop a new prescriptive elicitation technique from a theoretical analysis of probability assessment; this technique uses a theoretically grounded set of directed questions to help decision makers evoke information for probability assessment. Experimental results showed that both the knowledge map and the new directed-questions methodology elicited a greater quantity and higher quality of information from decision makers engaged in probability assessment tasks than did a control condition. Further, the information elicited by the two techniques was qualitatively different, suggesting that they might profitably be used as complementary elicitation techniques.
