Algorithmic injustice: a relational ethics approach

Cited by: 171
Authors
Birhane, Abeba [1 ,2 ]
Affiliations
[1] Univ Coll Dublin, Sch Comp Sci, Dublin, Ireland
[2] Lero Irish Software Res Ctr, Dublin, Ireland
Source
PATTERNS | 2021 / Vol. 2 / Issue 02
Funding
Science Foundation Ireland;
Keywords
Afro-feminism; artificial intelligence; complex systems; data science; DSML 1: Concept: Basic principles of a new data science output observed and reported; embodiment; enaction; ethics; justice; machine learning; relational epistemology;
DOI
10.1016/j.patter.2021.100205
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
It has become trivial to point out that algorithmic systems increasingly pervade the social sphere. Improved efficiency, the hallmark of these systems, drives their mass integration into day-to-day life. However, as a robust body of research in the area of algorithmic injustice shows, algorithmic systems, especially when used to sort and predict social outcomes, are not only inadequate but also perpetuate harm. In particular, a persistent and recurrent trend within the literature indicates that society's most vulnerable are disproportionately impacted. When algorithmic injustice and harm are brought to the fore, most of the solutions on offer (1) revolve around technical fixes and (2) do not center disproportionately impacted communities. This paper proposes a fundamental shift, from rational to relational, in thinking about personhood, data, justice, and everything in between, and treats ethics as something that goes above and beyond technical solutions. Outlining an idea of ethics built on the foundations of relationality, the paper calls for a rethinking of justice and ethics as a set of broad, contingent, and fluid concepts and down-to-earth practices that are best viewed as a habit, not a mere methodology, for data science. As such, this paper mainly offers critical examination and reflection rather than "solutions."
Pages: 9