Research on forecasting is effectively limited to forecasts that are expressed with clarity, which is to say that the forecasted event must be sufficiently well-defined that it can be clearly resolved whether or not the event occurred, and forecast certainty must be expressed as a quantitative probability. When forecasts are expressed with clarity, quantitative measures (scoring rules, calibration, discrimination, etc.) can be used to measure forecast accuracy, which in turn can be used to compare the accuracy of different forecasting methods. Unfortunately, most real-world forecasts are not expressed clearly. This lack of clarity extends both to the description of the forecast event and to the use of vague language to express forecast certainty. It is thus difficult to assess the accuracy of most real-world forecasts, and consequently the accuracy of the methods used to generate them. This paper addresses this deficiency by presenting an approach to measuring the accuracy of imprecise real-world forecasts using the same quantitative metrics routinely used to measure the accuracy of well-defined forecasts. To demonstrate applicability, the Inferred Probability Method is applied to measure the accuracy of forecasts in fourteen documents examining complex political domains.

Key words: inferred probability, imputed probability, judgment-based forecasting, forecast accuracy, imprecise forecasts, political forecasting, verbal probability, probability calibration.
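To make the accuracy metrics mentioned above concrete, the sketch below shows how verbal forecasts might be scored once numeric probabilities have been assigned to them. The phrase-to-probability mapping (VERBAL_TO_PROB) is a hypothetical illustration, not the translation table or elicitation procedure of the Inferred Probability Method; the Brier score and binned calibration summary are standard metrics of the kind the abstract refers to.

```python
# Minimal sketch, under stated assumptions: score verbal forecasts by first mapping
# certainty phrases to illustrative point probabilities, then evaluating them with a
# Brier score and a simple calibration table against resolved 0/1 outcomes.
from collections import defaultdict

# Hypothetical mapping from verbal probability expressions to numeric probabilities.
VERBAL_TO_PROB = {
    "almost certain": 0.95,
    "very likely": 0.85,
    "likely": 0.70,
    "even chance": 0.50,
    "unlikely": 0.25,
    "very unlikely": 0.10,
}

def brier_score(probs, outcomes):
    """Mean squared error between forecast probabilities and 0/1 outcomes (lower is better)."""
    return sum((p - o) ** 2 for p, o in zip(probs, outcomes)) / len(probs)

def calibration_table(probs, outcomes, bins=(0.0, 0.2, 0.4, 0.6, 0.8, 1.0)):
    """Group forecasts into probability bins and report the observed event frequency per bin."""
    grouped = defaultdict(list)
    for p, o in zip(probs, outcomes):
        for lo, hi in zip(bins, bins[1:]):
            if lo <= p <= hi:
                grouped[(lo, hi)].append(o)
                break
    return {b: sum(v) / len(v) for b, v in grouped.items()}

# Example: three verbal forecasts and whether each forecasted event actually occurred.
forecasts = ["likely", "very unlikely", "almost certain"]
occurred = [1, 0, 1]

probs = [VERBAL_TO_PROB[f] for f in forecasts]
print("Brier score:", brier_score(probs, occurred))
print("Calibration:", calibration_table(probs, occurred))
```

A well-calibrated forecaster's observed frequencies would roughly match the bin midpoints; with well-defined events and numeric probabilities in hand, the same machinery supports comparing forecasting methods head to head.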