Employers often struggle to assess qualified applicants, particularly when they receive hundreds of applications for a single opening. In an effort to increase efficiency and improve the process, many have begun using new tools to sift through these applications, looking for signals that a candidate is “the best fit.” Some companies use tools that offer algorithmic assessments of workforce data to identify the variables associated with stronger employee performance or with high employee attrition rates, while others turn to third-party ranking services to identify the top applicants in a labor pool. Still others eschew automated systems but rely heavily on publicly available data to assess candidates beyond their applications; for example, some HR managers turn to LinkedIn to determine whether a candidate knows current employees or to glean additional information about the candidate and their networks. Although most companies do not intentionally engage in discriminatory hiring practices (particularly on the basis of protected classes), their reliance on automated systems, algorithms, and existing networks systematically benefits some candidates at the expense of others, often without employers even recognizing the biases of these mechanisms. The intersection of hiring practices and the Big Data phenomenon has not produced inherently new challenges so much as it has amplified longstanding ones. While this paper addresses issues of privacy, fairness, transparency, accuracy, and inequality under the rubric of discrimination, it is not confined to the legal definitions of discrimination under current federal anti-discrimination law. Rather, it describes a number of areas where issues of inherent bias intersect with, or come into conflict with, socio-cultural notions of fairness.