WHAT DO OPEN-ENDED QUESTIONS MEASURE?

Survey researchers frequently use open-ended questions to measure public opinion. Some scholars, however, doubt how accurately these questions measure the public's views. A chief concern is that the questions tap, in part, people's ability to articulate a response rather than their underlying attitudes. This paper tests whether that concern is warranted. Using open-ended questions from the Center for Political Studies, I show that almost all people respond to open-ended questions. The few individuals who do not respond appear uninterested in the specific question posed, not unable to answer such questions in general. These findings should increase our confidence in the work of scholars who have relied on open-ended questions.

Survey researchers have used different methods to measure the political attitudes of the American public. One commonly used method is the open-ended question, which allows individuals to respond to a query in their own words. Many scholars contend that by allowing citizens to respond freely, the open-ended question better captures their salient concerns than the closed-ended format, which forces people to choose among a fixed set of responses (see, for instance, RePass, 1971; Kelley, 1983; Wattenberg, 1984).1 While the open-ended format has advantages, it has also drawn criticism. Among these criticisms is the belief that some citizens fail to

JOHN G. GEER teaches at Arizona State University. The author thanks Pat Kenney and Tom Rochon for their helpful comments on an earlier version of this paper. Data used in this article were made available by the Inter-university Consortium for Political and Social Research. The author bears sole responsibility for the analyses and interpretations.