Australian women's judgements about using artificial intelligence to read mammograms in breast cancer screening

Objective: Mammographic screening for breast cancer is an early use case for artificial intelligence (AI) in healthcare. This is an active area of research, mostly focused on the development and evaluation of individual algorithms. A growing normative literature argues that AI systems should reflect human values, but it is unclear what this requires in specific AI implementation scenarios. Our objective was to understand women's values regarding the use of AI to read mammograms in breast cancer screening.

Methods: We ran eight online discussion groups with a total of 50 women, focused on their expectations and normative judgements regarding the use of AI in breast screening.

Results: Although women were positive about the potential of breast screening AI, they argued strongly that humans must remain as central actors in breast screening systems and consistently expressed high expectations of the performance of breast screening AI. Women expected clear lines of responsibility for decision-making, to be able to contest decisions, and for AI to perform equally well for all programme participants. Women often imagined both that AI might replace radiographers and that AI implementation might allow more women to be screened: screening programmes will need to communicate carefully about these issues.

Conclusions: To meet women's expectations, screening programmes should delay implementation until there is strong evidence that the use of AI systems improves screening performance, should ensure that human expertise and responsibility remain central in screening programmes, and should avoid using AI in ways that exacerbate inequities.
