A multi-instance ensemble learning model based on concept lattice

Cited by: 18
Authors
Kang, Xiangping [1 ,2 ]
Li, Deyu [1 ,2 ]
Wang, Suge [1 ,2 ,3 ]
Affiliations
[1] Shanxi Univ, Sch Comp & Informat Technol, Taiyuan 030006, Shanxi, Peoples R China
[2] Minist Educ, Key Lab Computat Intelligence & Chinese Informat, Taiyuan 030006, Peoples R China
[3] Shanxi Univ, Sch Math Sci, Taiyuan 030006, Shanxi, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
◊-Concept lattice; Multi-instance learning; Local target feature set; Ensemble learning; Content-based image retrieval;
DOI
10.1016/j.knosys.2011.05.010
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
This paper introduces concept lattice and ensemble learning techniques into multi-instance learning and proposes a multi-instance ensemble learning model based on concept lattice that can be applied to content-based image retrieval and similar tasks. In this model, a ◊-concept lattice is first built from the training set. Because bags, rather than the instances inside bags, serve as the objects of the formal context when the ◊-concept lattice is constructed, the corresponding time and space complexity are reduced to a certain extent. Second, the multi-instance learning problem is divided into multiple local multi-instance learning problems based on the ◊-concept lattice, and a local target feature set is then found for each local problem. Finally, the whole training set can be classified almost correctly by an ensemble of the multiple local target feature sets. Precise theoretical analysis and extensive experiments show that the method is effective. The conclusions of this paper not only help to understand multi-instance learning better from the perspective of concept lattice, but also provide a new theoretical basis for data analysis and processing. Crown Copyright (C) 2011 Published by Elsevier B.V. All rights reserved.
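As an illustration of the core construction described in the abstract (not the authors' exact ◊-lattice algorithm), the sketch below builds a toy formal context in which bags, rather than individual instances, act as the objects, and enumerates its formal concepts with the standard derivation operators. The bag names, attributes, and data are hypothetical.

```python
from itertools import combinations

# Hypothetical formal context: each bag (the object) is described by the set of
# binary attributes satisfied by at least one of its instances.
context = {
    "bag1": {"a", "b"},
    "bag2": {"a", "c"},
    "bag3": {"b", "c"},
    "bag4": {"a", "b", "c"},
}
attributes = set().union(*context.values())

def extent(attr_set):
    """All bags whose description contains every attribute in attr_set."""
    return {g for g, attrs in context.items() if attr_set <= attrs}

def intent(obj_set):
    """All attributes shared by every bag in obj_set (all attributes if empty)."""
    if not obj_set:
        return set(attributes)
    return set.intersection(*(context[g] for g in obj_set))

# A formal concept is a pair (extent, intent) closed under the two derivation
# operators; closing every attribute subset enumerates all concepts of this
# small context.
concepts = set()
for r in range(len(attributes) + 1):
    for subset in combinations(sorted(attributes), r):
        e = extent(set(subset))
        concepts.add((frozenset(e), frozenset(intent(e))))

for e, i in sorted(concepts, key=lambda c: (len(c[0]), sorted(c[0]))):
    print(sorted(e), "<->", sorted(i))
```

Using bags as objects keeps the context to one row per bag rather than one row per instance, which is where the complexity reduction claimed in the abstract comes from; the partition into local multi-instance problems and the ensemble step are not shown here.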
Pages: 1203-1213
Page count: 11
Related papers
50 records in total
[21] Qiao, Maoying; Liu, Liu; Yu, Jun; Xu, Chang; Tao, Dacheng. Diversified dictionaries for multi-instance learning [J]. PATTERN RECOGNITION, 2017, 64: 407-416.
[22] Wei, X.; Xu, S.; An, P.; Yang, J. Multi-Instance Learning with Incremental Classes [J]. Jisuanji Yanjiu yu Fazhan/Computer Research and Development, 2022, 59(08): 1723-1731.
[23] Xiao, Yanshan; Liu, Bo; Hao, Zhifeng. Multi-Instance Nonparallel Tube Learning [J]. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2025, 36(02): 2563-2577.
[24] Zhang, Chun-Hua; Tan, Jun-Yan; Deng, Nai-Yang. Feature Selection in Multi-instance Learning [J]. OPERATIONS RESEARCH AND ITS APPLICATIONS, 2010, 12: 462-+.
[25] Zhang, Min-Ling; Zhou, Zhi-Hua. Multi-instance clustering with applications to multi-instance prediction [J]. APPLIED INTELLIGENCE, 2009, 31(01): 47-68.
[26] Gan, Rui; Yin, Jian. Feature selection in multi-instance learning [J]. NEURAL COMPUTING & APPLICATIONS, 2013, 23(3-4): 907-912.
[27] Zhang, Min-Ling; Zhou, Zhi-Hua. Adapting RBF Neural Networks to Multi-Instance Learning [J]. Neural Processing Letters, 2006, 23: 1-26.
[28] Zhang, M. L.; Zhou, Z. H. Adapting RBF neural networks to multi-instance learning [J]. NEURAL PROCESSING LETTERS, 2006, 23(01): 1-26.
[29] Zhao, Lu; Yuan, Liming; Hao, Kun; Wen, Xianbin. Generalized attention-based deep multi-instance learning [J]. MULTIMEDIA SYSTEMS, 2023, 29(01): 275-287.
[30] Lu Zhao; Liming Yuan; Kun Hao; Xianbin Wen. Generalized attention-based deep multi-instance learning [J]. Multimedia Systems, 2023, 29: 275-287.