Special Issue

Computational Ethics and Accountability are topics of growing societal impact. On the one hand, with recent advances in AI and machine-learning techniques, people and organizations increasingly accept decisions made for them by machines: buy-sell decisions, pre-filtering of applications, deciding which content users are shown and which personal data are shared with and used by third parties, up to automated driving. In each of these application scenarios, where algorithms and machines support or even replace human decisions, ethical issues may arise. On the other hand, algorithms and machines can take on the role of verifying and cross-checking the compliance of human actors by analyzing digital records of social interactions, for instance in business transactions and processes, but also with respect to rules of conduct in online social interactions. Closing the circle, such checks may in turn trigger automated decisions that are themselves subject to ethical requirements, such as non-discrimination and fairness.

Apart from the infamous "trolley problems" (Edmonds 2013), where even philosophers struggle to judge which decision is "right" and either choice has dramatic consequences, there are more subtle everyday decisions, which we now either happily delegate to machines or which are digitally recorded, that may have ethical implications: the handling of personal data must follow strict legal regulations, especially in social networks and when re-sharing personal data with businesses; (social) norms should be respected also in domains where automated agents enter interactions that were traditionally carried out by human actors only; and fair business practices should be ensured within business processes, compliant with regulations, laws, and best practices.
In all these areas, we expect, at the very least, transparency and accountability from automated decision and decision-support systems: that is, we should require these systems to be transparent about how they make decisions and to make clear who is accountable for those decisions and their effects. Many voices demand a more responsible approach to technology and engineering, as articulated in the Copenhagen Letter (Techfestival 2017) or in recent initiatives to standardize value-based, ethically compliant system design, such as IEEE's P7000 family of standards (Ethical Life-Cycle Concerns Working Group (EMELC-WG) 2017).