European Union Regulations on Algorithmic Decision-Making and a "Right to Explanation"

We summarize the potential impact that the European Union’s new General Data Protection Regulation will have on the routine use of machine learning algorithms. Slated to take effect as law across the EU in 2018, it will restrict automated individual decision-making (that is, algorithms that make decisions based on user-level predictors) which “significantly affect” users. The law will also effectively create a “right to explanation,” whereby a user can ask for an explanation of an algorithmic decision that was made about them. We argue that while this law will pose large challenges for industry, it highlights opportunities for computer scientists to take the lead in designing algorithms and evaluation frameworks which avoid discrimination and enable explanation.
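To make concrete what an "explanation" of an algorithmic decision and an evaluation for discrimination might look like in practice, the sketch below is a minimal illustration rather than a method from the paper: it fits a linear model to hypothetical user-level predictors, decomposes one decision into per-feature contributions, and computes a simple disparate-impact ratio. The feature names, the synthetic data, and the roughly 80% rule of thumb mentioned in the comments are illustrative assumptions, not anything specified by the GDPR or by this paper.

```python
# Illustrative sketch only: one possible reading of "explanation" and of a
# discrimination audit for an automated, user-level decision system.
# Assumes NumPy and scikit-learn; all data and thresholds are hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical user-level predictors for a credit-style decision.
feature_names = ["income", "debt_ratio", "years_employed"]
X = rng.normal(size=(500, 3))
y = (X[:, 0] - X[:, 1] + 0.5 * X[:, 2] + rng.normal(scale=0.5, size=500) > 0).astype(int)

model = LogisticRegression().fit(X, y)

def explain_decision(model, x, names):
    """Per-feature contributions (coefficient * value) to one decision.

    For a linear model the log-odds decompose additively, so each term can be
    read as that feature's contribution to this particular decision.
    """
    contributions = model.coef_[0] * x
    return dict(zip(names, contributions)), model.intercept_[0]

def disparate_impact_ratio(decisions, protected):
    """Ratio of favorable-outcome rates for the protected vs. reference group."""
    rate_protected = decisions[protected == 1].mean()
    rate_reference = decisions[protected == 0].mean()
    return rate_protected / rate_reference

# Explain a single applicant's decision.
x0 = X[0]
contribs, intercept = explain_decision(model, x0, feature_names)
print("decision:", int(model.predict(x0.reshape(1, -1))[0]))
print("intercept term:", round(intercept, 3))
for name, c in contribs.items():
    print(f"  {name}: {c:+.3f}")

# Audit decisions against a (hypothetical) protected attribute.
protected = rng.integers(0, 2, size=500)
ratio = disparate_impact_ratio(model.predict(X), protected)
print("disparate impact ratio:", round(ratio, 2))  # ratios well below ~0.8 are often treated as a red flag
```

A linear model is used here only because its log-odds decompose additively and so admit a straightforward per-feature explanation; the same auditing questions apply, with considerably more effort, to less transparent model classes.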
