FAM3L: Feature-Aware Multi-Modal Metric Learning for Integrative Survival Analysis of Human Cancers

Cited by: 7
Authors
Shao, Wei [1 ,2 ]
Liu, Jianxin [1 ]
Zuo, Yingli [1 ]
Qi, Shile [1 ]
Hong, Honghai [3 ]
Sheng, Jianpeng [4 ]
Zhu, Qi [1 ]
Zhang, Daoqiang [1 ]
Affiliations
[1] Nanjing Univ Aeronaut & Astronaut, Coll Comp Sci & Technol, Nanjing 210095, Peoples R China
[2] Nanjing Univ Aeronaut & Astronaut, Shenzhen Res Inst, Nanjing 210095, Peoples R China
[3] Guangzhou Med Univ, Affiliated Hosp 3, Dept Clin Lab, Guangzhou 510150, Peoples R China
[4] Zhejiang Univ, Affiliated Hosp 1, Sch Med, Hangzhou 310030, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Measurement; Cancer; Genomics; Bioinformatics; Feature extraction; Imaging; Prognostics and health management; Imaging genomics; feature-aware metric learning; Hilbert-Schmidt independence criterion; prognostic bio-marker identification; GENE; EXPRESSION; CELLS; MODEL;
DOI
10.1109/TMI.2023.3262024
Chinese Library Classification (CLC): TP39 [Computer applications];
Discipline codes: 081203 ; 0835 ;
Abstract
Survival analysis estimates the survival time of an individual patient or a group of patients, and is a key tool in planning cancer treatment. Recent studies suggest that integrative analysis of histopathological images and genomic data can predict the survival of cancer patients better than any single biomarker, since different biomarkers may provide complementary information. However, given multi-modal data that may contain irrelevant or redundant features, it remains challenging to design a distance metric that can simultaneously discover significant features and measure the difference in survival time among patients. To address this issue, we propose a Feature-Aware Multi-modal Metric Learning method (FAM3L), which not only learns a metric enforcing distance constraints on patients' survival times, but also identifies image and genomic features that are important for survival analysis. Specifically, for each data modality, we first design a feature-aware metric that can be decoupled into a traditional distance metric and a diagonal weight for identifying important features. Then, to explore the complex correlations across modalities, we apply the Hilbert-Schmidt Independence Criterion (HSIC) to learn the multiple metrics jointly. Finally, based on the learned distance metrics, we apply the Cox proportional hazards model for prognosis prediction. We evaluate our FAM3L method on three cancer cohorts derived from The Cancer Genome Atlas (TCGA); the experimental results demonstrate that it not only achieves superior performance for cancer prognosis, but also identifies meaningful image and genomic features that correlate strongly with cancer survival.
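The two ingredients the abstract names — a per-modality metric decoupled into a PSD matrix and a diagonal feature weight, and an HSIC term measuring cross-modal dependence — can be sketched as follows. This is a minimal illustration with linear kernels; the function names, the exact factorization, and the use of a biased HSIC estimator are assumptions for exposition, not the authors' implementation:

```python
import numpy as np

def feature_aware_distance(xi, xj, M, w):
    """Distance between two patients under a metric decoupled into a
    PSD matrix M (traditional metric) and a diagonal feature-weight
    vector w (feature importance). Hypothetical formulation:
    d(xi, xj) = sqrt((w*(xi-xj))^T M (w*(xi-xj)))."""
    d = w * (xi - xj)              # apply diagonal feature weights
    return float(np.sqrt(d @ M @ d))

def hsic(X, Y):
    """Biased empirical HSIC with linear kernels, measuring statistical
    dependence between two modality representations (rows = patients)."""
    n = X.shape[0]
    K = X @ X.T                    # linear kernel on modality 1
    L = Y @ Y.T                    # linear kernel on modality 2
    H = np.eye(n) - np.ones((n, n)) / n   # centering matrix
    return float(np.trace(K @ H @ L @ H)) / (n - 1) ** 2
```

With `M` set to the identity and `w` to all ones, `feature_aware_distance` reduces to the ordinary Euclidean distance; shrinking an entry of `w` toward zero suppresses the corresponding feature, which is how a diagonal weight can double as a feature selector.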
Pages: 2552 - 2565 (14 pages)