The Answer Bot Effect (ABE): A powerful new form of influence made possible by intelligent personal assistants and search engines

We introduce and quantify a relatively new form of influence: the Answer Bot Effect (ABE). In a 2015 report in PNAS, researchers demonstrated the power that biased search results have to shift opinions and voting preferences without people’s knowledge, by up to 80% in some demographic groups. They labeled this phenomenon the Search Engine Manipulation Effect (SEME), speculating that its power derives from the high level of trust people have in algorithmically generated content. We now describe three experiments, with a total of 1,736 US participants, conducted to determine the extent to which giving users “the answer,” either via an answer box at the top of a page of search results or via a vocal reply to a question posed to an intelligent personal assistant (IPA), might also impact opinions and votes. Participants were first given basic information about two candidates running for prime minister of Australia (to ensure that participants were “undecided”), then asked questions about their voting preferences, then given answers to questions they posed about the candidates (either in answer boxes or as vocal answers on an Alexa simulator), and then asked again about their voting preferences. The experiments were controlled, randomized, double-blind, and counterbalanced. Experiments 1 and 2 demonstrated that answer boxes can shift voting preferences by as much as 38.6% and that the mere appearance of an answer box can reduce search times and clicks on search results. Experiment 3 demonstrated that even a single question-and-answer interaction on an IPA can shift voting preferences by more than 40%, and that multiple questions posed to an IPA, all yielding answers with the same bias, can shift voting preferences by more than 65%. Simple masking procedures still produced large opinion shifts while reducing awareness of the bias to nearly zero. ABE poses a serious threat to both democracy and human autonomy because (a) it produces large shifts in opinions and voting preferences with little or no user awareness, (b) it is an ephemeral form of influence that leaves no paper trail, and (c) worldwide, it is controlled almost exclusively by just four American tech companies. ABE will become a greater threat as people increasingly rely on IPAs for answers.
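
As a rough, hypothetical illustration of how the pre/post shifts reported above could be quantified (this is not the authors’ code or their exact metric), the sketch below computes a percentage-point shift toward the bias-favored candidate for one group of participants; the Participant structure, its field names, the sample data, and the simple difference formula are assumptions introduced only for this example.

```python
# Minimal sketch: percentage-point change in support for the bias-favored
# candidate, computed from pre- and post-interaction choices. All names and
# data below are illustrative assumptions, not taken from the paper.

from dataclasses import dataclass
from typing import List

@dataclass
class Participant:
    pre_choice: str   # candidate preferred before the search/IPA interaction
    post_choice: str  # candidate preferred after the interaction
    favored: str      # candidate favored by the biased answers shown

def preference_shift(group: List[Participant]) -> float:
    """Percentage-point increase in support for the bias-favored candidate."""
    pre = sum(p.pre_choice == p.favored for p in group)
    post = sum(p.post_choice == p.favored for p in group)
    return 100.0 * (post - pre) / len(group)

# Hypothetical group of eight undecided participants: 3 preferred the targeted
# candidate before the interaction and 6 did afterward, a 37.5-point shift.
demo = [
    Participant("A", "A", "A"), Participant("A", "A", "A"), Participant("A", "A", "A"),
    Participant("B", "A", "A"), Participant("B", "A", "A"), Participant("B", "A", "A"),
    Participant("B", "B", "A"), Participant("B", "B", "A"),
]
print(f"{preference_shift(demo):.1f} percentage points")
```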

[1] R. Epstein, et al. The surprising power of a click requirement: How click requirements and warnings affect users’ willingness to disclose personal information, 2022, PLoS ONE.

[2] Grace W. Murray. Who is more trustworthy, Alexa or mom?: Children’s selective trust in a digital age, 2021, Technology, Mind, and Behavior.

[3] Yan Fossat, et al. Medication Name Comprehension of Intelligent Virtual Assistants: A Comparison of Amazon Alexa, Google Assistant, and Apple Siri Between 2019 and 2021, 2021, Frontiers in Digital Health.

[4] H. Matute, et al. The influence of algorithms on political and dating decisions, 2021, PLoS ONE.

[5] R. Watson, et al. Humans rely more on algorithms than social influence as a task becomes more difficult, 2021, Scientific Reports.

[6] H. Brockmann, et al. A class for itself? On the worldviews of the new tech elite, 2021, PLoS ONE.

[7] Ahmad S. Haider, et al. Google Autocomplete Search Algorithms and the Arabs' Perspectives on Gender: A Case Study of Google Egypt, 2020, GEMA Online® Journal of Language Studies.

[8] Dan Wu, et al. Credibility assessment of good abandonment results in mobile search, 2020, Inf. Process. Manag.

[9] John J. Howard, et al. Human-algorithm teaming in face recognition: How algorithm outcomes cognitively bias human decision-making, 2020, PLoS ONE.

[10] Stefan M. Herzog, et al. Public attitudes towards algorithmic personalization and use of personal data online: evidence from Germany, Great Britain, and the United States, 2020, Humanities and Social Sciences Communications.

[11] Franziska Pradel. Biased Representation of Politicians in Google and Wikipedia Search? The Joint Effect of Party Identity, Gender Identity and Elections, 2020.

[12] Olfa Nasraoui, et al. Evolution and impact of bias in human and machine learning algorithm interaction, 2020, PLoS ONE.

[13] Jianshan Sun, et al. Modeling undecided voters to forecast elections: From bandwagon behavior and the spiral of silence perspective, 2020.

[14] Eric R. Walsh-Buhi, et al. Evaluating Smart Assistant Responses for Accuracy and Misinformation Regarding Human Papillomavirus Vaccination: Content Analysis Study, 2020, Journal of Medical Internet Research.

[15] Daniel Burton Shank, et al. Exposed by AIs! People Personally Witness Artificial Intelligence Exposing Personal Information and Exposing People to Undesirable Content, 2020, Int. J. Hum. Comput. Interact.

[16] Adam S. Miner, et al. Chatbots in the fight against the COVID-19 pandemic, 2020, npj Digital Medicine.

[17] A. Scull. Dr. Google Will See You Now: Google’s Health Information Previews and Implications for Consumer Health, 2020, Medical Reference Services Quarterly.

[18] J. Ayers, et al. Responses to addiction help-seeking from Alexa, Siri, Google Assistant, Cortana, and Bixby intelligent virtual assistants, 2020, npj Digital Medicine.

[19] V. Steeves. A dialogic analysis of Hello Barbie’s conversations with children, 2020, Big Data Soc.

[20] G. Boca, et al. The Effect of Social Presence and Chatbot Errors on Trust, 2019, Sustainability.

[21] L. Ayalon, et al. Age and Gender Stereotypes Reflected in Google's "Autocomplete" Function: The Portrayal and Possible Spread of Societal Stereotypes, 2019, The Gerontologist.

[22] F. Arendt, et al. Googling for Trump: investigating online information seeking during the 2016 US presidential election, 2019.

[23] Iyad Rahwan, et al. Behavioural evidence for a transparency–efficiency tradeoff in human–machine cooperation, 2019, Nature Machine Intelligence.

[24] Brian W. Powers, et al. Dissecting racial bias in an algorithm used to manage the health of populations, 2019, Science.

[25] Anne Marie Piper, et al. Hey Google, Do Unicorns Exist?: Conversational Agents as a Path to Answers to Children's Questions, 2019, IDC.

[26] Nicholas Diakopoulos, et al. Search as News Curator: The Role of Google in Shaping Attention to News Information, 2019, CHI.

[27] Kinga Polynczuk-Alenius, et al. Algorithms of oppression: how search engines reinforce racism, 2019, Information, Communication & Society.

[28] Caterina Suitner, et al. Viral suspicions: Vaccine hesitancy in the Web 2.0, 2019, Journal of Experimental Psychology: Applied.

[29] Gregory Zobel. Review of "Algorithms of oppression: how search engines reinforce racism," by Noble, S. U. (2018). New York, New York: NYU Press, 2019, CDQR.

[30] Shoshana Zuboff. The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power, 2019.

[31] Suyash Bhamore. Decrypting Google’s Search Engine Bias Case: Anti-Trust Enforcement in the Digital Age, 2019, Christ University Law Journal.

[32] Eva-Patricia Fernández-Manzano, et al. Analytic surveillance: Big data business models in the time of privacy awareness, 2018.

[33] Nick Wilson, et al. Just ask Siri? A pilot study comparing smartphone digital assistants and laptop Google searches for smoking cessation advice, 2018, PLoS ONE.

[34] Kim Bartel Sheehan, et al. Crowdsourcing research: Data collection with Amazon’s Mechanical Turk, 2018.

[35] David Lazer, et al. Suppressing the Search Engine Manipulation Effect (SEME), 2017, Proc. ACM Hum. Comput. Interact.

[36] Tony Doyle, et al. Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy, 2017, Inf. Soc.

[37] Charles L. A. Clarke, et al. The Positive and Negative Influence of Search Results on People's Decisions about the Efficacy of Medical Treatments, 2017, ICTIR.

[38] Julian Unkel, et al. Ranking versus reputation: perception and effects of search result credibility, 2017, Behav. Inf. Technol.

[39] M. Iorga, et al. “Alexa, Can I Trust You?”, 2017, Computer.

[40] William W. Cohen, et al. Social Influences on Online Political Information Search and Evaluation, 2017.

[41] Karrie Karahalios, et al. "Be Careful; Things Can Be Worse than They Appear": Understanding Biased Algorithms and Users' Behavior Around Them in Rating Platforms, 2017, ICWSM.

[42] Maya Cakmak, et al. Toys that Listen: A Study of Parents, Children, and Internet-Connected Toys, 2017, CHI.

[43] G. Dong, et al. Short-term Internet search using makes people rely on search engines when facing unknown issues, 2017, PLoS ONE.

[44] P. Schulz, et al. Manipulating Google’s Knowledge Graph Box to Counter Biased Information Processing During an Online Search on Vaccination: Application of a Technological Debiasing Strategy, 2016, Journal of Medical Internet Research.

[45] Meg Leta Jones, et al. Can (and should) Hello Barbie keep a secret?, 2016, 2016 IEEE International Symposium on Ethics in Engineering, Science and Technology (ETHICS).

[46] N. Ravaja, et al. Negativity Bias in Media Multitasking: The Effects of Negative Social Media Messages on Attention to Television News Broadcasts, 2016, PLoS ONE.

[47] Maurizio Borghi, et al. Search Engine Liability for Autocomplete Suggestions: Personality, Privacy and the Power of the Algorithm, 2015, Int. J. Law Inf. Technol.

[48] Ronald E. Robertson, et al. The search engine manipulation effect (SEME) and its possible impact on the outcomes of elections, 2015, Proceedings of the National Academy of Sciences.

[49] Benjamin K. Johnson, et al. Political Online Information Searching in Germany and the United States: Confirmation Bias, Source Credibility, and Attitude Impacts, 2015.

[50] Frank A. Pasquale. The Black Box Society: The Secret Algorithms That Control Money and Information, 2015.

[51] Robyn L. Kondrad, et al. Can't stop believing: inhibitory control and resistance to misleading testimony, 2014, Developmental Science.

[52] Peter Johannes Schulz, et al. The Impact of Search Engine Selection and Sorting Criteria on Vaccination Beliefs and Attitudes: Two Experiments Manipulating Google Output, 2014, Journal of Medical Internet Research.

[53] Reem Alzahabi, et al. Children Show Selective Trust in Technological Informants, 2013.

[54] Engin Bozdag, et al. Bias in algorithmic filtering and personalization, 2013, Ethics and Information Technology.

[55] Paul Baker, et al. ‘Why do white people have thin lips?’ Google and the perpetuation of stereotypes via auto-complete search forms, 2013.

[56] Brian R. Christian, et al. The Most Human Human: What Artificial Intelligence Teaches Us About Being Alive, 2012.

[57] Eli Pariser, et al. The Filter Bubble: How the New Personalized Web Is Changing What We Read and How We Think, 2012.

[58] R. Crutzen, et al. An artificially intelligent chat agent that answers adolescents' questions related to sex, drugs, and alcohol: an exploratory study, 2011, The Journal of Adolescent Health: Official Publication of the Society for Adolescent Medicine.

[59] Matthew Fuller, et al. Personal Web searching in the age of semantic capitalism: Diagnosing the mechanisms of personalisation, 2011, First Monday.

[60] Benjamin Edelman. Bias in Search Results?: Diagnosis and Response, 2011, Indian Journal of Law and Technology.

[61] V. Jaswal, et al. Young Children Have a Specific, Highly Robust Bias to Trust Testimony, 2010, Psychological Science.

[62] Susan L. Gerhart, et al. Do Web search engines suppress controversy?, 2004, First Monday.

[63] L. Carretié, et al. Emotion, attention, and the 'negativity bias', studied through event-related potentials, 2001, International Journal of Psychophysiology: Official Journal of the International Organization of Psychophysiology.

[64] Elizabeth F. Loftus, et al. Leading questions and the eyewitness report, 1975, Cognitive Psychology.

[65] Artur Strzelecki, et al. Direct Answers in Google Search Results, 2020, IEEE Access.

[66] Cynthia J. Larose, et al. Children's Online Privacy Protection Act, 2015.

[67] Todd G. Shields, et al. The Persuadable Voter, 2014.

[68] Patricia A. Ganea, et al. Dealing with conflicting information: young children's reliance on what they see versus what they are told, 2010, Developmental Science.

[69] William G. Mayer. The Swing Voter in American Politics, 2008.