Some Ethical and Legal Issues in Using Artificial Intelligence in Personnel Selection

Times Cited: 0
Authors
Ringelband, Olaf [1 ,3 ]
Warneke, Christian [2 ]
Affiliations
[1] Univ Hamburg, Dept Psychol, Hamburg, Germany
[2] Hamburg Univ Appl Sci, Dept Publ Management, Hamburg, Germany
[3] MD Management Diagnost, Alsterufer 37, D-20354 Hamburg, Germany
Keywords
artificial intelligence; assessment; big data; ethical tensions; personnel selection; PSYCHOLOGY;
DOI
10.1037/cpb0000289
Chinese Library Classification Code
B849 [Applied Psychology];
Discipline Classification Code
040203;
Abstract
The present article identifies four types of data that can be analyzed by artificial intelligence (AI)-powered hiring tools at various stages of the selection process: (a) data from traditional psychometric instruments, (b) data from AI-based assessment tools, (c) data from the "digital footprint" of given individuals, and (d) direct and indirect observations. In all of these methods, AI makes predictions and supports hiring and promotion decisions using historical data ("training data"). Understanding both the nature of these data and the underlying functionality of the AI is crucial for evaluating the ethical and legal issues that may arise when employing AI in assessments, such as discrimination and inequities. Questions addressed here include the following: (1) Does a bias already exist in the data on which the AI system has been trained? (2) Does the algorithm have parameters based on psychological theories and evidence? (3) Does the AI system allow the "black box" to be opened? The article also notes that AI-driven assessments have the potential to deliver more precise evaluations of individuals. Additionally, because these assessments rely on a wealth of data and on numerous interactions drawn from real-life circumstances, they can address typical rather than merely maximal behavior, the latter being what psychometric tests typically measure. The article furthermore discusses a number of ethical issues that arise in personnel selection using AI.
Pages: 15