Causal Inference with Knowledge Distilling and Curriculum Learning for Unbiased VQA

Cited by: 27
Authors
Pan, Yonghua [1 ]
Li, Zechao [1 ]
Zhang, Liyan [2 ]
Tang, Jinhui [1 ]
Affiliations
[1] Nanjing Univ Sci & Technol, 200 Xiaolingwei St, Nanjing 210094, Jiangsu, Peoples R China
[2] Nanjing Univ Aeronaut & Astronaut, 29 Yudao St, Nanjing 210016, Jiangsu, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Visual question answering; neural networks; knowledge distillation; causal inference;
DOI
10.1145/3487042
CLC Number
TP [Automation Technology, Computer Technology];
Discipline Code
0812;
Abstract
Many recent Visual Question Answering (VQA) models rely on correlations between questions and answers while neglecting those between the visual and the textual information. Consequently, they perform poorly when the test data are distributed differently from the training data (i.e., out-of-distribution (OOD) data). To address this, we propose a two-stage unbiased VQA approach that tackles the bias issue from a causal perspective. In the causal inference stage, we mark the spurious correlation on the causal graph, explore counterfactual causality, and devise a causal target based on the inherent correlations between the conventional and counterfactual VQA models. In the distillation stage, we introduce the causal target into the training process and leverage knowledge distillation together with curriculum learning to obtain an unbiased model. Because Causal Inference with Knowledge Distilling and Curriculum Learning (CKCL) reinforces the contribution of the visual information and eliminates the impact of the spurious correlation by distilling the knowledge from causal inference into the VQA model, it performs well on both standard and out-of-distribution data. Extensive experimental results on the VQA-CP v2 dataset demonstrate the superior performance of the proposed method compared with state-of-the-art (SotA) methods.
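The abstract outlines a two-stage recipe: first derive a debiased "causal target" by contrasting the conventional VQA prediction with a counterfactual (question-only) one, then distill that target into the VQA model under a curriculum schedule. The minimal PyTorch sketch below illustrates one plausible reading of that recipe; the subtraction-based causal target, the linear curriculum ramp, and all function and parameter names are illustrative assumptions, not the paper's actual formulation.

```python
# Hypothetical sketch of the two-stage idea in the abstract, assuming a
# subtraction-based causal target and a linear curriculum ramp (neither
# is taken from the paper itself).
import torch
import torch.nn.functional as F

def causal_target(vqa_logits, q_only_logits, tau=1.0):
    """Debiased teacher distribution: remove the question-only
    (counterfactual) effect from the full-model prediction."""
    debiased = vqa_logits - q_only_logits  # assumed form of the causal target
    return F.softmax(debiased / tau, dim=-1)

def ckcl_loss(student_logits, vqa_logits, q_only_logits, labels,
              epoch, total_epochs, tau=1.0):
    """Distillation with a curriculum: the weight on the causal target
    grows linearly over training (one plausible schedule)."""
    target = causal_target(vqa_logits, q_only_logits, tau).detach()
    # KL divergence between student predictions and the causal target,
    # scaled by tau^2 as is conventional in knowledge distillation.
    kd = F.kl_div(F.log_softmax(student_logits / tau, dim=-1),
                  target, reduction="batchmean") * tau ** 2
    ce = F.cross_entropy(student_logits, labels)
    alpha = min(1.0, epoch / max(1, total_epochs // 2))  # assumed ramp
    return (1 - alpha) * ce + alpha * kd

if __name__ == "__main__":
    n, c = 4, 10  # batch size, number of candidate answers
    loss = ckcl_loss(torch.randn(n, c), torch.randn(n, c),
                     torch.randn(n, c), torch.randint(0, c, (n,)),
                     epoch=3, total_epochs=20)
    print(loss.item())
```

The curriculum here simply shifts weight from the plain cross-entropy loss to the distillation term as training progresses, so the model first learns the task and only later conforms to the debiased teacher; other schedules would fit the abstract's description equally well.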
Pages: 23
Related Papers
50 records in total
  • [21] Unbiased Causal Inference From an Observational Study: Results of a Within-Study Comparison
    Pohl, Steffi
    Steiner, Peter M.
    Eisermann, Jens
    Soellner, Renate
    Cook, Thomas D.
    EDUCATIONAL EVALUATION AND POLICY ANALYSIS, 2009, 31 (04) : 463 - 479
  • [22] Causal ML: Python package for causal inference machine learning
    Zhao, Yang
    Liu, Qing
    SOFTWAREX, 2023, 21
  • [23] Knowledge distilling based model compression and feature learning in fault diagnosis
    Zhang, Wenfeng
    Biswas, Gautam
    Zhao, Qi
    Zhao, Hongbo
    Feng, Wenquan
    APPLIED SOFT COMPUTING, 2020, 88
  • [24] Deep Knowledge Tracing Integrating Temporal Causal Inference and PINN
    Lu, Faming
    Li, Yingran
    Bao, Yunxia
    APPLIED SCIENCES-BASEL, 2025, 15 (03)
  • [25] Stochastic intervention for causal inference via reinforcement learning
    Duong, Tri Dung
    Li, Qian
    Xu, Guandong
    NEUROCOMPUTING, 2022, 482 : 40 - 49
  • [26] Enhanced Graph Learning for Recommendation via Causal Inference
    Wang, Suhua
    Ji, Hongjie
    Yin, Minghao
    Wang, Yuling
    Lu, Mengzhu
    Sun, Hui
    MATHEMATICS, 2022, 10 (11)
  • [27] Targeted VAE: Variational and Targeted Learning for Causal Inference
    Vowels, Matthew J.
    Camgoz, Necati Cihan
    Bowden, Richard
    2021 IEEE INTERNATIONAL CONFERENCE ON SMART DATA SERVICES (SMDS 2021), 2021, : 132 - 141
  • [28] CauseKG: A Framework Enhancing Causal Inference With Implicit Knowledge Deduced From Knowledge Graphs
    Huang, Hao
    Vidal, Maria-Esther
    IEEE ACCESS, 2024, 12 : 61810 - 61827
  • [29] Light-Weight Deformable Registration Using Adversarial Learning With Distilling Knowledge
    Tran, Minh Q.
    Do, Tuong
    Tran, Huy
    Tjiputra, Erman
    Tran, Quang D.
    Nguyen, Anh
    IEEE TRANSACTIONS ON MEDICAL IMAGING, 2022, 41 (06) : 1443 - 1453
  • [30] Developing a novel causal inference algorithm for personalized biomedical causal graph learning using meta machine learning
    Wu, Hang
    Shi, Wenqi
    Wang, May D.
    BMC MEDICAL INFORMATICS AND DECISION MAKING, 2024, 24 (01)