Causal Inference with Knowledge Distilling and Curriculum Learning for Unbiased VQA

Cited by: 28
Authors
Pan, Yonghua [1 ]
Li, Zechao [1 ]
Zhang, Liyan [2 ]
Tang, Jinhui [1 ]
Affiliations
[1] Nanjing Univ Sci & Technol, 200 Xiaolingwei St, Nanjing 210094, Jiangsu, Peoples R China
[2] Nanjing Univ Aeronaut & Astronaut, 29 Yudao St, Nanjing 210016, Jiangsu, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Visual question answering; neural networks; knowledge distillation; causal inference;
DOI
10.1145/3487042
CLC classification
TP [Automation Technology, Computer Technology];
Discipline code
0812;
Abstract
Recently, many Visual Question Answering (VQA) models rely on correlations between questions and answers while neglecting those between the visual and the textual information. As a result, they perform poorly when the test data are distributed differently from the training data (i.e., out-of-distribution (OOD) data). To address this, we propose a two-stage unbiased VQA approach that tackles the bias issue from a causal perspective. In the causal inference stage, we mark the spurious correlation on the causal graph, explore the counterfactual causality, and devise a causal target based on the inherent correlations between the conventional and counterfactual VQA models. In the distillation stage, we introduce the causal target into the training process and leverage knowledge distillation together with curriculum learning to obtain an unbiased model. Since Causal Inference with Knowledge Distilling and Curriculum Learning (CKCL) reinforces the contribution of the visual information and eliminates the impact of the spurious correlation by distilling the knowledge gained in causal inference into the VQA model, it achieves good performance on both standard and out-of-distribution data. Extensive experimental results on the VQA-CP v2 dataset demonstrate the superior performance of the proposed method compared to state-of-the-art (SotA) methods.
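The two-stage recipe described in the abstract — first derive a counterfactual "causal target", then distill it into the VQA model with a curriculum-weighted loss — can be sketched as follows. This is an illustrative sketch only, not the paper's exact formulation: the function names, the subtraction-style counterfactual debiasing (full-model logits minus question-only logits), and the linear curriculum ramp are all assumptions.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over the last axis."""
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def causal_target(logits_vqa, logits_q_only):
    """Hypothetical causal target: remove the question-only (counterfactual)
    prediction from the full model's prediction, keeping only the part of the
    answer distribution that depends on the image."""
    return softmax(logits_vqa - logits_q_only)

def distill_loss(student_logits, target, eps=1e-12):
    """KL divergence pushing the student's distribution toward the causal target."""
    p = softmax(student_logits)
    return float(np.sum(target * (np.log(target + eps) - np.log(p + eps))))

def curriculum_weight(epoch, total_epochs):
    """Ramp the distillation term from 0 to 1 over training (assumed schedule)."""
    return min(1.0, epoch / total_epochs)

# Combined training objective for one example (cross-entropy + weighted distillation):
def total_loss(student_logits, label_idx, target, epoch, total_epochs, eps=1e-12):
    ce = -float(np.log(softmax(student_logits)[label_idx] + eps))
    return ce + curriculum_weight(epoch, total_epochs) * distill_loss(student_logits, target)
```

A question-only branch that scores highly regardless of the image contributes equally to `logits_vqa` and `logits_q_only`, so the subtraction cancels it; whatever signal remains in `causal_target` is, under this sketch's assumptions, the image-dependent part that the student is distilled toward.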
Pages: 23