Opening the 'black box' of HRM algorithmic biases - How hiring practices induce discrimination on freelancing platforms

Cited by: 2
Authors
Trautwein, Yannik [1 ]
Zechiel, Felix [1 ]
Coussement, Kristof [2 ]
Meire, Matthijs [2 ]
Buettgen, Marion [1 ]
Affiliations
[1] Univ Hohenheim, Inst Mkt & Management, Chair Corp Management, Schwerzstr 42, D-70599 Stuttgart, Germany
[2] Univ Lille, IESEG Sch Management, CNRS, UMR 9221 LEM Lille Econ Management, 3 Rue Digue, F-59000 Lille, France
Keywords
Algorithmic HRM; Artificial intelligence; Discrimination; Gig economy; Algorithmic bias; Age discrimination; Field experiment; Social status; Gender; Race; Identity; Stereotypes; Performance
DOI
10.1016/j.jbusres.2025.115298
Chinese Library Classification: F [Economics]
Subject classification code: 02
Abstract
Online freelancing platforms extensively apply algorithms and AI, for example to rank freelancers. These platforms are often considered neutral because they do not display freelancers' gender, race, or age, yet recent studies have revealed mounting freelancer complaints of unfair treatment and discrimination stemming from the platforms' algorithms. Drawing on social dominance theory, this study contributes to the algorithmic HRM literature by uncovering an indirect algorithmic discrimination mechanism that explains bias in algorithmic rankings. Using an Upwork dataset of 44,167 freelancers and structural equation modeling, we find that the number of jobs completed through the platform mediates the effects of gender, race, and age on the platform's ranking, demonstrating discrimination against female, Black women, Asian, and younger candidates. The study's theoretical contributions to the algorithmic HRM literature, the methodological contribution of a novel AI picture analysis tool, and managerial implications for online freelancing platforms and HR departments are discussed.
Pages: 14