Melissa Terras | Benjamin Bach | Beatrice Alex | Lucy Havens
[1] Hilary Bradbury, et al. Participatory action research as practice, 2008.
[2] Anne Welsh, et al. The Rare Books Catalog and the Scholarly Database, 2016.
[3] Latanya Sweeney, et al. Discrimination in online ad delivery, 2013, CACM.
[4] C. Perez, et al. Invisible Women: Exposing Data Bias in a World Designed for Men, 2020.
[5] Emily M. Bender, et al. Data Statements for Natural Language Processing: Toward Mitigating System Bias and Enabling Better Science, 2018, TACL.
[6] Steven Bird, et al. NLTK: The Natural Language Toolkit, 2002, ACL.
[7] Yoav Goldberg, et al. Lipstick on a Pig: Debiasing Methods Cover up Systematic Gender Biases in Word Embeddings But do not Remove Them, 2019, NAACL-HLT.
[8] Claudio Cobelli, et al. 11 – Case studies, 2008.
[9] Adam Tauman Kalai, et al. What are the Biases in My Word Embedding?, 2018, AIES.
[10] Anne Marie Piper, et al. Addressing Age-Related Bias in Sentiment Analysis, 2018, CHI.
[11] Orestis Papakyriakopoulos, et al. Bias in word embeddings, 2020, FAT*.
[12] D. Haraway. Situated Knowledges: The Science Question in Feminism and the Privilege of Partial Perspective, 1988.
[13] Vicki L. Hanson, et al. Writing about accessibility, 2015, Interactions.
[14] M. Talbot. Gender stereotypes: reproduction and challenge, 2008.
[15] Noah A. Smith, et al. Evaluating Gender Bias in Machine Translation, 2019, ACL.
[16] W. Frisby, et al. Continuing the Journey: Articulating Dimensions of Feminist Participatory Action Research (FPAR), 2008.
[17] Steven Bird, et al. NLTK: The Natural Language Toolkit, 2002, ACL.
[18] K. Crenshaw. Mapping the margins: intersectionality, identity politics, and violence against women of color, 1991.
[19] T. V. Leeuwen. Discourse as the Recontextualization of Social Practice, 2008.
[20] I. Gleibs, et al. Are all “research fields” equal? Rethinking practice for the use of data from crowdsourcing market places, 2016, Behavior Research Methods.
[21] Arvind Narayanan, et al. Semantics derived automatically from language corpora contain human-like biases, 2016, Science.
[22] Solon Barocas, et al. Language (Technology) is Power: A Critical Survey of “Bias” in NLP, 2020, ACL.
[23] K. Deaux, et al. The Times They Are a-Changing … or Are They Not? A Comparison of Gender Stereotypes, 1983–2014, 2016.
[24] Mary Bucholtz, et al. Theories of Discourse as Theories of Gender: Discourse Analysis in Language and Gender Studies, 2008.
[25] Gadi Gilam, et al. The dark side of gendered language: The masculine-generic form as a cause for self-report bias, 2015, Psychological Assessment.
[26] Lukas Engelmann, et al. Plague Dot Text: Text Mining and Annotation of Outbreak Reports of the Third Plague Pandemic (1894-1952), 2019, HistoInformatics@TPDL.
[27] Danushka Bollegala, et al. Gender-preserving Debiasing for Pre-trained Word Embeddings, 2019, ACL.
[28] Sandra Harding, et al. “Strong objectivity”: A response to the new objectivity question, 1995, Synthese.
[29] J. C. Winck, et al. Times they are a-changing, 2010, Revista Portuguesa de Pneumologia.
[30] Roopika Risam, et al. Beyond the Margins: Intersectionality and the Digital Humanities, 2015, Digit. Humanit. Q.
[31] Alan W. Black, et al. Measuring Bias in Contextualized Word Representations, 2019, Proceedings of the First Workshop on Gender Bias in Natural Language Processing.
[32] Daniel Jurafsky, et al. Word embeddings quantify 100 years of gender and ethnic stereotypes, 2017, Proceedings of the National Academy of Sciences.
[33] Tara McPherson, et al. Why Are the Digital Humanities So White? or Thinking the Histories of Race and Computation, 2013.
[34] Andrew Valls, et al. Racism, 2009, The Palgrave Encyclopedia of Imperialism and Anti-Imperialism.
[35] D. Fitch, et al. Review of "Algorithms of oppression: how search engines reinforce racism," by Noble, S. U. (2018). New York, New York: NYU Press, 2018, CDQR.
[36] Peter Willmott, et al. 11 – Case studies, 2001.
[37] Helen Nissenbaum, et al. Bias in computer systems, 1996, TOIS.
[38] Yasmeen Hitti, et al. Proposed Taxonomy for Gender Bias in Text; A Filtering Methodology for the Gender Generalization Subtype, 2019, Proceedings of the First Workshop on Gender Bias in Natural Language Processing.
[39] Jason Baldridge, et al. Mind the GAP: A Balanced Corpus of Gendered Ambiguous Pronouns, 2018, TACL.
[40] Adam Tauman Kalai, et al. Man is to Computer Programmer as Woman is to Homemaker? Debiasing Word Embeddings, 2016, NIPS.
[41] Jieyu Zhao, et al. Gender Bias in Coreference Resolution: Evaluation and Debiasing Methods, 2018, NAACL.
[42] J. Weibler, et al. Discourse, 1984, Language in Society.
[43] M. Caswell, et al. Neither a beginning nor an end, 2019, The Routledge International Handbook of New Digital Practices in Galleries, Libraries, Archives, Museums and Heritage Sites.
[44] M. Terras, et al. Of global reach yet of situated contexts: an examination of the implicit and explicit selection criteria that shape digital archives of historical newspapers, 2020, Archival Science.
[45] Haoran Zhang, et al. Hurtful words: quantifying biases in clinical contextual word embeddings, 2020, CHIL.
[46] Michael Rovatsos, et al. Algorithmic Fairness in Online Information Mediating Systems, 2017, WebSci.
[47] Rada Mihalcea, et al. Women’s Syntactic Resilience and Men’s Grammatical Luck: Gender-Bias in Part-of-Speech Tagging and Dependency Parsing, 2019, ACL.