Satisficing in Web Surveys: Implications for Data Quality and Strategies for Reduction.

With the growing use of the Web in mixed-mode surveys, especially those conducted by the Census Bureau and other federal statistical agencies, it has become more urgent than ever to develop methods that enhance online measurement quality. This dissertation research, comprising three studies, focuses on respondent satisficing as a source of online measurement error and on an intervention approach for reducing satisficing behaviors. The first study evaluates speeding (very fast responding) as an indicator of satisficing by investigating how it is associated with another well-known satisficing behavior, non-differentiation in grid questions. The second and third studies extend the scope of previous research on Web survey interventions by investigating how the design of an intervention affects its success in curtailing respondent satisficing. Specifically, the second study examines whether interventions targeting different satisficing behaviors produce different impacts on overall response quality. The third study explores whether an intervention that appears to come from a human (manipulated through the image and text displayed in the intervention prompt) performs differently from an intervention that is obviously automated computer feedback. The findings lead to a better understanding of satisficing behaviors and of the mechanisms through which interventions reduce them.
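To make the first study's indicators concrete, the sketch below shows one common way speeding and grid non-differentiation ("straightlining") might be operationalized from respondent-level data. It is an illustrative sketch only, not the dissertation's actual code; the column names (duration_sec, grid item columns) and the 0.5-of-median speeding cutoff are assumptions chosen for demonstration.

```python
# Illustrative sketch: flag speeders and score non-differentiation in a grid,
# then compare the two. Thresholds and column names are assumptions.
import pandas as pd

def flag_speeders(df: pd.DataFrame, time_col: str = "duration_sec",
                  fraction_of_median: float = 0.5) -> pd.Series:
    """Mark respondents whose completion time falls below a fraction of the median."""
    cutoff = df[time_col].median() * fraction_of_median
    return df[time_col] < cutoff

def nondifferentiation(df: pd.DataFrame, grid_cols: list[str]) -> pd.Series:
    """Share of grid items matching the respondent's modal answer.
    A score of 1.0 means every item got the same answer (straightlining)."""
    grid = df[grid_cols]
    modal_matches = grid.eq(grid.mode(axis=1)[0], axis=0).sum(axis=1)
    return modal_matches / len(grid_cols)

# Hypothetical usage:
# df["speeder"] = flag_speeders(df)
# df["nondiff"] = nondifferentiation(df, ["q1_a", "q1_b", "q1_c", "q1_d"])
# print(df.groupby("speeder")["nondiff"].mean())  # do speeders straightline more?
```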
