Investigating Bias in Facial Analysis Systems: A Systematic Review

Cited by: 40
Authors
Khalil, Ashraf [1]
Ahmed, Soha Glal [1]
Khattak, Asad Masood [2]
Al-Qirim, Nabeel [3]
Affiliations
[1] Abu Dhabi University, College of Engineering, Abu Dhabi 59911, United Arab Emirates
[2] Zayed University, College of Technological Innovation, Abu Dhabi 144534, United Arab Emirates
[3] United Arab Emirates University, College of Information Technology, Al Ain 15551, United Arab Emirates
Keywords
Algorithmic discrimination; classification bias; facial analysis; bias; unfairness; database
DOI
10.1109/ACCESS.2020.3006051
CLC Classification Number
TP [Automation Technology, Computer Technology]
Discipline Classification Code
0812
Abstract
Recent studies have demonstrated that most commercial facial analysis systems are biased against certain categories of race, ethnicity, culture, age, and gender. In some cases the bias can be traced to the algorithms used, in others to insufficient training of those algorithms, and in still others to inadequate training databases. To date, no comprehensive literature review has systematically investigated bias and discrimination in currently available facial analysis software. To address this gap, this study conducts a systematic literature review (SLR) that examines the context of facial analysis system bias in detail. The review, covering 24 studies, additionally aims to identify (a) facial analysis databases created to alleviate bias, (b) the full range of bias in facial analysis software, and (c) algorithms and techniques implemented to mitigate bias in facial analysis.
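The classification bias discussed in the abstract can be made concrete as a gap in error rates between demographic groups. The sketch below is illustrative only and is not taken from the reviewed paper; the group names, sample records, and helper function are assumptions used purely to show how such a disparity might be computed.

# Minimal illustrative sketch (not from the paper): classification bias measured
# as the gap in error rates between demographic groups. Group names, sample
# records, and the helper function are assumptions for illustration only.
from collections import defaultdict

def per_group_error_rates(records):
    # records: iterable of (group, true_label, predicted_label) tuples
    errors, totals = defaultdict(int), defaultdict(int)
    for group, y_true, y_pred in records:
        totals[group] += 1
        errors[group] += int(y_true != y_pred)
    return {g: errors[g] / totals[g] for g in totals}

# Toy predictions from a hypothetical face classifier that errs more on group_b.
sample = [
    ("group_a", 1, 1), ("group_a", 0, 0), ("group_a", 1, 1), ("group_a", 0, 0),
    ("group_b", 1, 0), ("group_b", 0, 1), ("group_b", 1, 1), ("group_b", 0, 0),
]
rates = per_group_error_rates(sample)
print(rates)                                      # per-group error rates
print(max(rates.values()) - min(rates.values()))  # error-rate gap; a large gap indicates bias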
Pages: 130751-130761
Page count: 11