Including Don't know answer options in brand image surveys improves data quality

How do respondents use the Don't know answer option in surveys? We investigate this question in the context of brand image measurement, using an experimental design with about 2,000 respondents and, for the first time, considering a range of commonly used answer formats. Results indicate that the Don't know option is used primarily when respondents genuinely cannot answer the question, rather than as a quick, low-effort way to complete a survey. Two practical conclusions arise from this study: (1) a Don't know option should be offered whenever some respondents may be unfamiliar with the brands under study; and (2) answer formats without a midpoint should be used in brand image studies, because midpoints can be misinterpreted as an alternative to ticking the Don't know option or used as an avenue for respondent satisficing.
