Deep Convolutional Neural Networks for Multi-Instance Multi-Task Learning

Cited by: 36
Authors
Zeng, Tao [1]
Ji, Shuiwang [1]
Affiliation
[1] Washington State Univ, Sch Elect Engn & Comp Sci, Pullman, WA 99164 USA
Source
2015 IEEE INTERNATIONAL CONFERENCE ON DATA MINING (ICDM) | 2015
Keywords
Deep learning; multi-instance learning; multi-task learning; transfer learning; bioinformatics; annotation
DOI
10.1109/ICDM.2015.92
Chinese Library Classification (CLC) number
TP18 [Theory of Artificial Intelligence]
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
Multi-instance learning studies problems in which labels are assigned to bags that contain multiple instances. In these settings, the relations between instances and labels are usually ambiguous. In contrast, multi-task learning focuses on the output space, in which an input sample is associated with multiple labels. In the real world, a sample may be associated with multiple labels derived from observing multiple aspects of the problem, so many real-world applications are naturally formulated as multi-instance multi-task (MIMT) problems. A common approach to MIMT is to solve each task independently under the multi-instance learning framework. On the other hand, convolutional neural networks (CNNs) have demonstrated promising performance on single-instance single-label image classification tasks. However, how a CNN should handle multi-instance multi-label tasks remains an open problem, mainly because of the complex many-to-many relations between the input and output spaces. In this work, we propose a deep learning model, known as multi-instance multi-task convolutional neural networks (MIMT-CNN), in which a number of images representing a multi-task problem are taken as the inputs. A shared sub-CNN is then connected to each input image to form instance representations. These sub-CNN outputs are subsequently aggregated as inputs to additional convolutional layers and fully connected layers to produce the final multi-label predictions. Through transfer learning from other domains, this CNN model enables the transfer of image-level prior knowledge learned from large single-label single-task data sets. The bag-level representations in this model are hierarchically abstracted from instance-level representations by multiple layers. Experimental results on mouse brain gene expression pattern annotation data show that the proposed MIMT-CNN model achieves superior performance.
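
To make the data flow described in the abstract concrete, the following is a minimal PyTorch sketch of a multi-instance multi-task CNN in that spirit: a shared sub-CNN encodes every image (instance) in a bag, the per-instance feature maps are aggregated and passed through additional convolutional layers, and fully connected layers emit one sigmoid score per task/label. The class names, layer sizes, and the channel-concatenation aggregation below are illustrative assumptions, not the authors' published configuration or code.

import torch
import torch.nn as nn


class SharedSubCNN(nn.Module):
    """Shared instance-level encoder applied to every image in a bag."""

    def __init__(self, out_channels: int = 64):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(32, out_channels, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d((8, 8)),   # fixed spatial size per instance
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.features(x)             # (N, out_channels, 8, 8)


class MIMTCNN(nn.Module):
    """Bag-level model: shared sub-CNN -> aggregation convs -> FC multi-label head."""

    def __init__(self, num_instances: int, num_tasks: int):
        super().__init__()
        self.sub_cnn = SharedSubCNN(out_channels=64)
        # Concatenate per-instance feature maps along the channel axis and let
        # additional conv layers learn the bag-level representation (assumed
        # aggregation scheme; bag size must equal num_instances).
        self.bag_conv = nn.Sequential(
            nn.Conv2d(64 * num_instances, 128, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d((1, 1)),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(128, 256), nn.ReLU(),
            nn.Linear(256, num_tasks),       # one logit per task/label
        )

    def forward(self, bag: torch.Tensor) -> torch.Tensor:
        # bag: (batch, num_instances, 3, H, W)
        b, k = bag.shape[:2]
        inst = self.sub_cnn(bag.flatten(0, 1))           # (b*k, 64, 8, 8)
        inst = inst.view(b, -1, *inst.shape[-2:])        # (b, 64*k, 8, 8)
        logits = self.classifier(self.bag_conv(inst))    # (b, num_tasks)
        return torch.sigmoid(logits)                     # multi-label scores


if __name__ == "__main__":
    model = MIMTCNN(num_instances=4, num_tasks=10)
    bags = torch.randn(2, 4, 3, 64, 64)    # 2 bags of 4 RGB images each
    print(model(bags).shape)               # torch.Size([2, 10])

Keeping the spatial feature maps when concatenating instances (rather than pooling them away immediately) lets the later convolutional layers build the bag-level representation hierarchically from instance-level features, mirroring the description in the abstract.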
Pages: 579 - 588
Number of pages: 10
Related Papers
50 records in total
  • [1] A Prototype Learning Based Multi-Instance Convolutional Neural Network
    He, K.-L.
    Shi, Y.-H.
    Gao, Y.
    Huo, J.
    Wang, D.
    Zhang, Y.
    Science Press, 2017, 40: 1265 - 1274
  • [2] Ensemble of multi-task deep convolutional neural networks using transfer learning for fruit freshness classification
    Kang, Jaeyong
    Gwak, Jeonghwan
    MULTIMEDIA TOOLS AND APPLICATIONS, 2022, 81 (16) : 22355 - 22377
  • [3] Multi-Task Learning for Food Identification and Analysis with Deep Convolutional Neural Networks
    Zhang, Xi-Jin
    Lu, Yi-Fan
    Zhang, Song-Hai
    JOURNAL OF COMPUTER SCIENCE AND TECHNOLOGY, 2016, 31 (03) : 489 - 500
  • [4] CNN based Multi-Instance Multi-Task Learning for Syndrome Differentiation of Diabetic Patients
    Wang, Zeyuan
    Poon, Josiah
    Sun, Shiding
    Poon, Simon
    PROCEEDINGS 2018 IEEE INTERNATIONAL CONFERENCE ON BIOINFORMATICS AND BIOMEDICINE (BIBM), 2018, : 1905 - 1911
  • [5] Multi-Adaptive Optimization for multi-task learning with deep neural networks
    Hervella, Álvaro S.
    Rouco, Jose
    Novo, Jorge
    Ortega, Marcos
    NEURAL NETWORKS, 2024, 170 : 254 - 265
  • [6] Adapting RBF Neural Networks to Multi-Instance Learning
    Zhang, Min-Ling
    Zhou, Zhi-Hua
    NEURAL PROCESSING LETTERS, 2006, 23 : 1 - 26
  • [7] Cell tracking using deep neural networks with multi-task learning
    He, Tao
    Mao, Hua
    Guo, Jixiang
    Yi, Zhang
    IMAGE AND VISION COMPUTING, 2017, 60 : 142 - 153
  • [8] MULTI-TASK LEARNING FOR SEGMENTATION OF BUILDING FOOTPRINTS WITH DEEP NEURAL NETWORKS
    Bischke, Benjamin
    Helber, Patrick
    Folz, Joachim
    Borth, Damian
    Dengel, Andreas
    2019 IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING (ICIP), 2019, : 1480 - 1484