Sensitive Topics in PC Web and Mobile Web Surveys: Is There a Difference?

A large body of survey research suggests that misreporting on sensitive questions is situational and varies with context. The methodological literature shows that social desirability bias is less prevalent in self-administered surveys, particularly Web surveys, where no interviewer is present and there is less risk of presenting oneself in an unfavorable light. Given the growing number of mobile Web users, we examined whether the device used to complete a Web survey (PC or cell phone) affects respondents' willingness to report sensitive information. To reduce selection bias, we carried out a two-wave cross-over experiment using a volunteer online access panel in Russia, in which participants completed the questionnaire in both modes: PC Web and mobile Web. We hypothesized that features of mobile Web use may reduce response accuracy and lead to more socially desirable responses than the PC Web mode. Consistent with this hypothesis, we found significant mode differences in reports of alcohol consumption, but other sensitive questions showed no similar effects. We also found that the presence of familiar bystanders affected responses, whereas the presence of strangers had no significant effect in either mode. Contrary to expectations, we found no evidence that completing the questionnaire at home or trusting in data confidentiality increased reporting. These results can help survey practitioners design Web surveys and improve data quality across devices.
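
To illustrate how a mode effect might be tested in a two-wave cross-over design of this kind, the sketch below fits a linear mixed model with a random intercept per respondent to simulated data. This is not the authors' analysis code; the data, variable names, and effect sizes are hypothetical, and the only assumption is the availability of the numpy, pandas, and statsmodels packages.

```python
# Hypothetical sketch: testing a survey-mode effect in a two-wave cross-over
# design with a linear mixed model (random intercept per respondent).
# The data below are simulated; variable names are illustrative only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 200  # respondents, each answering in both waves

# Cross-over: half complete the PC mode first, half the mobile mode first.
respondent = np.repeat(np.arange(n), 2)
wave = np.tile([1, 2], n)
first_mode = np.repeat(rng.choice(["pc", "mobile"], size=n), 2)
mode = np.where(wave == 1, first_mode,
                np.where(first_mode == "pc", "mobile", "pc"))

# Simulated sensitive report (e.g., drinks in the past week) with a
# person-level random effect and a small negative mobile-mode effect.
person_effect = np.repeat(rng.normal(0, 1.0, size=n), 2)
report = 3 + person_effect - 0.4 * (mode == "mobile") + rng.normal(0, 1.0, 2 * n)

df = pd.DataFrame({"respondent": respondent, "wave": wave,
                   "mode": mode, "report": report})

# The random intercept accounts for repeated measures on the same person;
# the fixed effect of mode estimates the PC vs. mobile reporting gap,
# and wave controls for order effects.
model = smf.mixedlm("report ~ C(mode) + C(wave)", df, groups=df["respondent"])
result = model.fit()
print(result.summary())
```

In a cross-over design like this, each respondent serves as their own control, so the mode coefficient is estimated within persons rather than between groups, which is why a mixed model (rather than an independent-samples comparison) is the natural choice.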
