Enslaving the Algorithm: From a “Right to an Explanation” to a “Right to Better Decisions”?

As concerns about unfairness and discrimination in “black box” machine learning systems rise, a legal “right to an explanation” has emerged as a compellingly attractive approach for challenge and redress. We outline recent debates on the limited provisions in European data protection law, and introduce and analyze newer explanation rights in French administrative law and the draft modernized Council of Europe Convention 108. While individual rights can be useful, in privacy law they have historically placed an unreasonable burden on the average data subject. “Meaningful information” about algorithmic logics is more technically possible than commonly thought, but this exacerbates a new “transparency fallacy”: an illusion of remedy rather than anything substantively helpful. While rights-based approaches deserve a firm place in the toolbox, other forms of governance, such as impact assessments, “soft law,” judicial review, and model repositories, deserve more attention, alongside catalyzing agencies that act for users to control algorithmic system design.
