The paradox of the artificial intelligence system development process: the use case of corporate wellness programs using smart wearables

Cited by: 6
Authors
Angelucci, Alessandra [1 ]
Li, Ziyue [2 ,3 ]
Stoimenova, Niya [4 ]
Canali, Stefano [1 ,5 ]
Affiliations
[1] Politecn Milan, Dipartimento Elettron Informaz & Bioingn, Milan, Italy
[2] Univ Cologne, Fac Management Econ & Social Sci, Cologne Inst Informat Syst, Cologne, Germany
[3] Hong Kong Univ Sci & Technol, Dept Ind Engn & Decis Analyt, Hong Kong, Peoples R China
[4] Delft Univ Technol, Dept Ind Design, Delft, Netherlands
[5] Politecn Milan, META Social Sci & Humanities Sci & Technol, Milan, Italy
Keywords
Artificial intelligence; Fairness; Classification model; Corporate wellness program; Smartwatches
DOI
10.1007/s00146-022-01562-4
CLC number
TP18 [Theory of artificial intelligence]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Artificial intelligence (AI) systems have been widely applied in various contexts, including high-stakes decision processes in healthcare, banking, and judicial systems. Some of the AI models developed for these contexts fail to produce fair outputs for specific minority groups, sparking extensive discussion of AI fairness. We argue that the development of AI systems is marked by a central paradox: the less participation a stakeholder has within the AI system's life cycle, the more influence they have over the way the system will function. This means that the impact on the fairness of the system is in the hands of those who are less impacted by it. However, most existing work ignores how different aspects of AI fairness are dynamically and adaptively affected by different stages of AI system development. To this end, we present a use case to discuss fairness in the development of corporate wellness programs that use smart wearables and AI algorithms to analyze data. We present the four key stakeholders throughout this type of AI system development process: the service designer, the algorithm designer, the system deployer, and the end-user. We identify three core aspects of AI fairness, namely contextual fairness, model fairness, and device fairness, and propose the relative contribution of each of the four stakeholders to these three aspects. Furthermore, we propose the boundaries and interactions between the four roles, from which we draw conclusions about possible sources of unfairness in such an AI development process.
Pages: 1465-1475
Page count: 11
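To make the abstract's notion of "model fairness" concrete, the following is a minimal, hedged sketch in Python. It uses synthetic data, a hypothetical protected-group attribute, and a hypothetical biased predictor (none of which appear in the paper) to compute two standard group-fairness metrics for a binary wellness classifier; it illustrates the kind of check that fairness toolkits such as AI Fairness 360 automate, and is not the authors' method.

```python
# Minimal sketch of a "model fairness" check on a hypothetical binary wellness
# classifier, using synthetic data and two common group-fairness metrics.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic cohort: a protected attribute (e.g. a device/skin-tone proxy group)
# and a binary "meets activity target" ground-truth label.
n = 1_000
group = rng.integers(0, 2, size=n)    # 0 = reference group, 1 = protected group
y_true = rng.integers(0, 2, size=n)   # placeholder ground-truth labels

# Hypothetical biased predictor: less likely to predict the positive class for
# the protected group, mimicking sensor or sampling bias.
p_pos = 0.6 - 0.15 * group
y_pred = (rng.random(n) < p_pos).astype(int)

def statistical_parity_difference(y_pred, group):
    """P(yhat=1 | group=1) - P(yhat=1 | group=0)."""
    return y_pred[group == 1].mean() - y_pred[group == 0].mean()

def equal_opportunity_difference(y_true, y_pred, group):
    """Difference in true-positive rates between the two groups."""
    tpr = lambda g: y_pred[(group == g) & (y_true == 1)].mean()
    return tpr(1) - tpr(0)

print("Statistical parity difference:", statistical_parity_difference(y_pred, group))
print("Equal opportunity difference:", equal_opportunity_difference(y_true, y_pred, group))
```

Values near zero for both metrics indicate that the classifier treats the two groups similarly; large negative values here would reflect the kind of disparity the paper attributes to choices made at different stages of the development process.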