Can Algorithms Legitimize Discrimination?

Cited by: 55
Authors
Bonezzi, Andrea [1]
Ostinelli, Massimiliano [2]
Affiliations
[1] NYU, Stern School of Business, 40 West 4th St, New York, NY 10012 USA
[2] Winthrop University, College of Business Administration, Rock Hill, SC 29733 USA
Keywords
algorithmic bias; discrimination; disparities; inequality; algorithm aversion; BIG DATA; BIAS; PERCEPTIONS; PREDICTION; ANALYTICS; HEALTH; MANAGE; TESTS; RISK
DOI
10.1037/xap0000294
Chinese Library Classification (CLC)
B849 [Applied Psychology]
Discipline code
040203
Abstract
Algorithms have been the subject of a heated debate regarding their potential to yield biased decisions. Prior research has focused on documenting algorithmic bias and discussing its origins from a technical standpoint. We look at algorithmic bias from a psychological perspective, raising a fundamental question that has received little attention: are people more or less likely to perceive decisions that yield disparities as biased when such decisions stem from algorithms as opposed to humans? We find that algorithmic decisions that yield gender or racial disparities are less likely to be perceived as biased than human decisions. This occurs because people believe that algorithms, unlike humans, decontextualize decision-making by neglecting individual characteristics and blindly applying rules and procedures irrespective of whom they are judging. In situations that entail the potential for discrimination, this belief leads people to think that algorithms are more likely than humans to treat everyone equally, and thus less likely to yield biased decisions. This asymmetrical perception of bias, which occurs both in the general population and among members of stigmatized groups, leads people to endorse stereotypical beliefs that fuel discrimination and reduces their willingness to act against potentially discriminatory outcomes.
Pages: 447-459
Page count: 13