Statistics Commentary Series: Commentary No. 23: A Plague of Decimals: Why Too Much Precision Can Be Misleading.

At first glance, there doesn’t appear to be anything wrong with the (fictitious) data in Table 1; they simply report the means, standard deviations (SDs), and correlations for 3 variables – age, education, and socioeconomic status – for 20 people in a study. Indeed, the table is similar to those found in many papers describing the demographic data of the participants and the relationships among the variables. At another level, though, all 9 numbers in the table are misleading, because they are too precise. This may sound oxymoronic; how can an estimate of a parameter be too precise? After all, much of what we are taught in courses on methodology is that our measurements should be as accurate as possible. In our research and clinical work, we look for scales that have high reliabilities, because they provide a more precise estimate of what we are trying to measure than scales with poor reliability; we average a number of measurements or items, because the error terms will tend to cancel out, leading to a more accurate estimate; and we take other steps to maximize the precision of our measurements. Furthermore, 0.667 is a closer approximation of 2/3 than is 0.67, which in turn is closer than 0.7. So again we can ask the question: how can the numbers in the table be too precise?

The answer comes down to the issue of how much accuracy can be supported by the data, and this in turn depends on 2 factors: the way the data were gathered in the first place, and the number of observations contributing to the estimate. Let’s begin with the correlations, which are reported to 4 decimal places. The Publication Manual of the American Psychological Association, which is the “bible” for many journals in the social sciences, states that “As a general rule, fewer decimal digits are easier to comprehend than more digits; therefore, in general, it is better to round to two decimal places” (p. 113), a sentiment echoed by other guidelines.
Notice, though, that the argument is phrased in terms of comprehension, not precision. I would argue that even reporting correlations to 2 decimal places is an example of “pseudoprecision” that cannot be justified in the vast majority of studies. We did a Monte Carlo simulation, using correlations of 0.15, 0.30, 0.50, and 0.70, and for each generated random samples of 60, 100, 200, 500, 1,000, 10,000, and 100,000 values. What we were interested in was the reproducibility of the first, second, third, and fourth decimal place. We concluded that “even when n is less than 500, the habit of reporting a result to two decimal places seems unwarranted, and it never makes sense to report the third digit after the decimal place unless one has a sample size larger than 100,000” (p. 687).

This should not be surprising. As Feinberg and Wainer pointed out in their delightful paper, in order for the third decimal place to be reproducible, the standard error (SE) needs to be less than 0.0005; since the SE is approximately 1/√n, this means √n = 1/0.0005 = 2,000, and therefore n would have to be 4,000,000. By the same logic, the first decimal place requires an SE below 0.05, so √n = 20 and a sample size greater than 400 is necessary for even the first decimal place to be reproducible. That is, for the vast majority of studies in psychopharmacology, where sample sizes over 100 are rare, even the second digit of a correlation is basically an irreproducible value, and the third and fourth digits do not represent increased precision, but rather more sampling error.

The other numbers in the table, the means and SDs, adhere to the Publication Manual’s recommended 2 digits, but I would argue that these, too, are examples of an unwarranted degree of precision. The second decimal place for age represents 3.65 days (i.e., 1/100th of a year). If we determined the participants’ age by asking them how old they were at their last birthday, then we are measuring age with an average inaccuracy of 182 days.
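Returning to the correlations for a moment: a minimal sketch of the kind of Monte Carlo check described above (my own reconstruction for illustration, not the authors’ actual code) draws repeated samples at a fixed true correlation and asks how stable the decimal places of the sample r are across replications. The sample size, true correlation, and number of replications below are arbitrary choices for the demonstration.

```python
# Sketch of the Monte Carlo reproducibility check: repeatedly sample n
# bivariate-normal pairs with a known population correlation, compute the
# sample r each time, and compare its spread (SE) to the 0.5 / 10**k
# threshold needed for the k-th decimal place to be reproducible.
import math
import random

def sample_r(true_r, n, rng):
    """Pearson r from n bivariate-normal pairs with population correlation true_r."""
    xs, ys = [], []
    for _ in range(n):
        x = rng.gauss(0, 1)
        # Construct y so that corr(x, y) = true_r in the population.
        y = true_r * x + math.sqrt(1 - true_r ** 2) * rng.gauss(0, 1)
        xs.append(x)
        ys.append(y)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / math.sqrt(sxx * syy)

rng = random.Random(23)
n, true_r, reps = 100, 0.30, 500
rs = [sample_r(true_r, n, rng) for _ in range(reps)]
mean_r = sum(rs) / reps
se = math.sqrt(sum((r - mean_r) ** 2 for r in rs) / (reps - 1))

# The k-th decimal place of r is reproducible only when SE < 0.5 / 10**k.
print(f"empirical SE of r at n = {n}: {se:.3f}")
for k in (1, 2, 3):
    print(f"decimal place {k} stable? {se < 0.5 / 10 ** k}")
```

At n = 100 the empirical SE comes out near 0.09, above the 0.05 threshold, so by this rule not even the first decimal place of r is reproducible, which is consistent with the n > 400 figure in the text.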
Given that degree of imprecision in how the underlying data were gathered to begin with, it is fatuous to imply that we know the average age of the participants to within 4 days of accuracy. The situation is even more extreme in the case of education. The school year is about 200 days long, so the second decimal place implies that we know the length of time a person was sitting at a desk to within 2 days of accuracy.

Note that this situation is different from, say, measuring the height of a person as 70 inches (or 178 cm for everyone outside the U.S., Liberia, and Myanmar). If we want to be more precise, we can use a more finely divided ruler and say 70.2 inches, or even 70.23 inches. Those extra decimal places are more precise estimates of the true value, because they