Fair and Interpretable Models for Survival Analysis

Cited by: 11
Authors
Rahman, Md Mahmudur [1 ]
Purushotham, Sanjay [1 ]
Affiliations
[1] Univ Maryland Baltimore Cty, Baltimore, MD 21228 USA
Source
PROCEEDINGS OF THE 28TH ACM SIGKDD CONFERENCE ON KNOWLEDGE DISCOVERY AND DATA MINING, KDD 2022 | 2022
Funding
U.S. National Science Foundation
Keywords
Survival analysis; Fairness; Interpretability; Neural networks; Pseudo values; Censoring;
DOI
10.1145/3534678.3539259
Chinese Library Classification
TP [Automation Technology, Computer Technology]
Discipline Classification Code
0812
Abstract
Survival analysis aims to predict the risk of an event, such as death due to cancer, in the presence of censoring. Recent research has shown that existing survival techniques are prone to unintentional biases with respect to protected attributes such as age, race, and/or gender. For example, the common assumption that censoring is unrelated to prognosis and covariates, which is typically violated in real data, often leads to overestimated and biased survival predictions for different protected groups. To attenuate harmful bias and ensure fair survival predictions, we introduce fairness definitions based on survival functions and censoring. We propose novel fair and interpretable survival models that use pseudo value-based objective functions with fairness definitions as constraints for predicting subject-specific survival probabilities. Experiments on three real-world survival datasets demonstrate that our proposed fair survival models show significant improvement over existing survival techniques in terms of accuracy and fairness measures. We show that our proposed models provide fair predictions for protected attributes under different types and amounts of censoring. Furthermore, we study the interplay between interpretability and fairness, and investigate how fairness and censoring impact survival predictions for different protected attributes.
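To make the abstract's core ingredients concrete, the following is a minimal Python sketch, not the authors' implementation, of jackknife pseudo-observations for the survival function (Andersen & Perme, 2010) combined with a fairness-regularized objective. The group-gap penalty, the fairness_weight parameter, and all function names are illustrative assumptions standing in for the paper's censoring-based fairness constraints.

import numpy as np

def kaplan_meier(times, events, eval_time):
    # Kaplan-Meier estimate of the survival probability S(eval_time).
    times = np.asarray(times, dtype=float)
    events = np.asarray(events, dtype=int)
    order = np.argsort(times)
    surv = 1.0
    at_risk = len(times)
    for t, e in zip(times[order], events[order]):
        if t > eval_time:
            break
        if e == 1:
            surv *= 1.0 - 1.0 / at_risk
        at_risk -= 1
    return surv

def jackknife_pseudo_values(times, events, eval_time):
    # Jackknife pseudo-observations for S(eval_time) (Andersen & Perme, 2010):
    # pv_i = n * S_full - (n - 1) * S_leave_one_out_i.
    times = np.asarray(times, dtype=float)
    events = np.asarray(events, dtype=int)
    n = len(times)
    full = kaplan_meier(times, events, eval_time)
    pv = np.empty(n)
    for i in range(n):
        mask = np.arange(n) != i
        pv[i] = n * full - (n - 1) * kaplan_meier(times[mask], events[mask], eval_time)
    return pv

def fair_pseudo_value_loss(pred, pseudo, group, fairness_weight=1.0):
    # Mean-squared error to the pseudo-values, plus a penalty on the largest
    # gap between any protected group's mean prediction and the overall mean.
    # The penalty form and fairness_weight are illustrative assumptions.
    pred = np.asarray(pred, dtype=float)
    mse = np.mean((pred - np.asarray(pseudo, dtype=float)) ** 2)
    gap = max(abs(pred[group == g].mean() - pred.mean()) for g in np.unique(group))
    return mse + fairness_weight * gap

# Toy usage: six subjects, two protected groups, evaluated at t = 4.5.
times = np.array([2.0, 3.0, 4.0, 5.0, 6.0, 7.0])
events = np.array([1, 0, 1, 1, 0, 1])   # 1 = event observed, 0 = censored
group = np.array([0, 0, 0, 1, 1, 1])
pv = jackknife_pseudo_values(times, events, eval_time=4.5)
print(fair_pseudo_value_loss(np.full(6, 0.6), pv, group, fairness_weight=0.5))

In the paper's setting, a neural network would minimize an objective of this shape to predict subject-specific survival probabilities; the sketch only illustrates how pseudo-values turn a censored target into a regression target that a fairness constraint can act on.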
Pages: 1452 - 1462
Number of pages: 11