FaceSaliencyAug: mitigating geographic, gender and stereotypical biases via saliency-based data augmentation

Cited by: 0
Authors
Kumar, Teerath [1,2]
Mileo, Alessandra [3,4]
Bendechache, Malika [5,6]
Affiliations
[1] Dublin City Univ, CRT AI, Sch Comp, Dublin, Ireland
[2] Dublin City Univ, ADAPT Res Ctr, Sch Comp, Dublin, Ireland
[3] Dublin City Univ, INSIGHT, Sch Comp, Dublin, Ireland
[4] Dublin City Univ, I Form Res Ctr, Sch Comp, Dublin, Ireland
[5] Univ Galway, ADAPT & Lero Res Ctr, Sch Comp Sci, Galway, Ireland
[6] Univ Galway, Lero Res Ctr, Sch Comp Sci, Galway, Ireland
Funding
Science Foundation Ireland;
Keywords
Bias mitigation; Convolutional neural network; Data augmentation; Data diversity; Saliency augmentation;
DOI
10.1007/s11760-024-03623-1
Chinese Library Classification (CLC)
TM [Electrical Engineering]; TN [Electronics and Communication Technology];
Subject Classification Codes
0808 ; 0809 ;
Abstract
Geographical, gender and stereotypical biases in computer vision models pose significant challenges to their performance and fairness. In this study, we present an approach named FaceSaliencyAug aimed at addressing gender bias in Convolutional Neural Networks (CNNs) and Vision Transformers (ViTs). Leveraging salient facial regions detected by a saliency detector, the proposed approach mitigates geographical and stereotypical biases in the datasets. FaceSaliencyAug randomly selects a mask from a predefined search space, applies it to the salient region of a face image, and then restores the masked salient region into the original image. The proposed augmentation strategy enhances data diversity, thereby improving model performance and debiasing effects. We quantify dataset diversity using the Image Similarity Score (ISS) across several face datasets, including Flickr Faces HQ (FFHQ), WIKI, IMDB, Labelled Faces in the Wild (LFW), UTK Faces, and a Diverse Dataset. The proposed approach demonstrates superior diversity, as evaluated by the ISS-intra and ISS-inter metrics. Furthermore, we evaluate the effectiveness of our approach in mitigating gender bias on CEO, Engineer, Nurse, and School Teacher datasets, using the Image-Image Association Score (IIAS) to measure gender bias in these occupations. Our experiments reveal a reduction in gender bias for both CNNs and ViTs, indicating the efficacy of our method in promoting fairness and inclusivity in computer vision models.
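To give a concrete picture of the augmentation step summarized in the abstract, the following is a minimal sketch of a saliency-guided masking augmentation in the spirit of FaceSaliencyAug. The OpenCV fine-grained saliency detector, the mask-size search space `mask_fractions`, and the zero fill value are assumptions made for illustration; the paper's exact detector, search space, and mask placement may differ.

```python
# Minimal sketch (not the authors' implementation) of saliency-guided masking.
# Requires opencv-contrib-python and numpy.
import random
import cv2
import numpy as np


def saliency_mask_augment(image: np.ndarray,
                          mask_fractions=(0.1, 0.2, 0.3),  # assumed search space
                          fill_value: int = 0) -> np.ndarray:
    """Mask a randomly sized patch inside the most salient facial region."""
    # 1. Compute a static saliency map (fine-grained variant from opencv-contrib).
    detector = cv2.saliency.StaticSaliencyFineGrained_create()
    ok, sal = detector.computeSaliency(image)
    if not ok:
        return image.copy()
    sal = (sal * 255).astype(np.uint8)

    # 2. Threshold the map and take the bounding box of the salient pixels.
    _, binary = cv2.threshold(sal, 0, 255, cv2.THRESH_BINARY | cv2.THRESH_OTSU)
    ys, xs = np.nonzero(binary)
    if len(xs) == 0:
        return image.copy()
    x0, x1, y0, y1 = xs.min(), xs.max(), ys.min(), ys.max()

    # 3. Randomly pick a mask size from the predefined search space and place
    #    it at a random position inside the salient bounding box.
    frac = random.choice(mask_fractions)
    mh = max(1, int((y1 - y0) * frac))
    mw = max(1, int((x1 - x0) * frac))
    my = random.randint(y0, max(y0, y1 - mh))
    mx = random.randint(x0, max(x0, x1 - mw))

    # 4. Write the masked salient patch back into a copy of the original image.
    augmented = image.copy()
    augmented[my:my + mh, mx:mx + mw] = fill_value
    return augmented
```

In a training pipeline the augmented image would be used alongside (or in place of) the original sample; the paper then measures the resulting dataset diversity with ISS-intra/ISS-inter and the residual gender bias with IIAS.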
Pages: 11