Where fairness fails: data, algorithms, and the limits of antidiscrimination discourse

Cited by: 246
Authors
Hoffmann, Anna Lauren [1 ]
Affiliation
[1] Univ Washington, Informat Sch, Seattle, WA 98195 USA
Keywords
Big data; algorithms; antidiscrimination; social justice; intersectionality; DISCRIMINATION; INFORMATION; RACE; BLIND; BIAS;
DOI
10.1080/1369118X.2019.1573912
Chinese Library Classification
G2 [Information and knowledge dissemination];
Subject classification codes
05; 0503;
Abstract
Problems of bias and fairness are central to data justice, as they speak directly to the threat that 'big data' and algorithmic decision-making may worsen already existing injustices. In the United States, grappling with these problems has found clearest expression through liberal discourses of rights, due process, and antidiscrimination. Work in this area, however, has tended to overlook certain established limits of antidiscrimination discourses for bringing about the change demanded by social justice. In this paper, I engage three of these limits: 1) an overemphasis on discrete 'bad actors', 2) single-axis thinking that centers disadvantage, and 3) an inordinate focus on a limited set of goods. I show that, in mirroring some of antidiscrimination discourse's most problematic tendencies, efforts to achieve fairness and combat algorithmic discrimination fail to address the very hierarchical logic that produces advantaged and disadvantaged subjects in the first place. Finally, I conclude by sketching three paths for future work to better account for the structural conditions against which we come to understand problems of data and unjust discrimination.
Pages: 900-915
Number of pages: 16