Artificial fairness? Trust in algorithmic police decision-making

Cited by: 20
Authors
Hobson, Zoe [1 ]
Yesberg, Julia A. [1 ]
Bradford, Ben [1 ]
Jackson, Jonathan [2 ,3 ]
Affiliations
[1] UCL, Inst Global City Policing, Dept Secur & Crime Sci, 35 Tavistock Sq, London WC1H 9EZ, England
[2] London Sch Econ & Polit Sci, Dept Methodol, London, England
[3] Sydney Law Sch, Sydney, NSW, Australia
Keywords
Algorithms; Fairness; Police decision-making; Technology; Trust; BODY-WORN CAMERAS; PROCEDURAL JUSTICE; PUBLIC SUPPORT; LEGITIMACY; COOPERATION;
D O I
10.1007/s11292-021-09484-9
Chinese Library Classification (CLC)
DF [Law]; D9 [Law];
Discipline classification code
0301;
Abstract
Objectives Test whether (1) people view a policing decision made by an algorithm as more or less trustworthy than the same decision made by an officer; (2) people presented with a specific instance of algorithmic policing express greater or lesser support for the use of algorithmic policing in general; and (3) people use trust as a heuristic through which to make sense of an unfamiliar technology like algorithmic policing. Methods An online experiment tested whether different decision-making methods, outcomes and scenario types affect judgements about the appropriateness and fairness of decision-making and the general acceptability of police use of this particular technology. Results People see a decision as less fair and less appropriate when an algorithm decides than when an officer decides. Yet perceptions of fairness and appropriateness were strong predictors of support for police use of algorithms, and exposure to a successful use of an algorithm was linked, via trust in the decision made, to greater support for police use of algorithms. Conclusions Basing decisions solely on algorithms might damage trust, and the more police rely solely on algorithmic decision-making, the less people may trust those decisions. However, mere exposure to the successful use of algorithms seems to enhance the general acceptability of this technology.
Pages: 165-189
Number of pages: 25
References
70 records in total
[1]  
[Anonymous], 2020, PREDICTIVE POLICING, P3
[2]  
Araujo T., 2018, AUTOMATED DECISION M
[3]   Paradoxical effects of self-awareness of being observed: testing the effect of police body-worn cameras on assaults and aggression against officers [J].
Ariel, Barak ;
Sutherland, Alex ;
Henstock, Darren ;
Young, Josh ;
Drover, Paul ;
Sykes, Jayne ;
Megicks, Simon ;
Henderson, Ryan .
JOURNAL OF EXPERIMENTAL CRIMINOLOGY, 2018, 14 (01) :19-47
[4]  
Ashby M., 2020, STOP SEARCH LONDON J
[5]  
Babuta Alexander, 2020, Royal United Services Institute Occasional Paper
[6]   Big Data's Disparate Impact [J].
Barocas, Solon ;
Selbst, Andrew D. .
CALIFORNIA LAW REVIEW, 2016, 104 (03) :671-732
[7]   'It's Reducing a Human Being to a Percentage': Perceptions of Justice in Algorithmic Decisions [J].
Binns, Reuben ;
Van Kleek, Max ;
Veale, Michael ;
Lyngs, Ulrik ;
Zhao, Jun ;
Shadbolt, Nigel .
PROCEEDINGS OF THE 2018 CHI CONFERENCE ON HUMAN FACTORS IN COMPUTING SYSTEMS (CHI 2018), 2018,
[8]   LIVE FACIAL RECOGNITION: TRUST AND LEGITIMACY AS PREDICTORS OF PUBLIC SUPPORT FOR POLICE USE OF NEW TECHNOLOGY [J].
Bradford, Ben ;
Yesberg, Julia A. ;
Jackson, Jonathan ;
Dawson, Paul .
BRITISH JOURNAL OF CRIMINOLOGY, 2020, 60 (06) :1502-1522
[9]   Identity, legitimacy and "making sense" of police use of force [J].
Bradford, Ben ;
Milani, Jenna ;
Jackson, Jonathan .
POLICING-AN INTERNATIONAL JOURNAL OF POLICE STRATEGIES & MANAGEMENT, 2017, 40 (03) :614-627
[10]   Contact and confidence: revisiting the impact of public encounters with the police [J].
Bradford, Ben ;
Jackson, Jonathan ;
Stanko, Elizabeth A. .
POLICING & SOCIETY, 2009, 19 (01) :20-46