A multi-instance ensemble learning model based on concept lattice

Cited by: 18
Authors
Kang, Xiangping [1 ,2 ]
Li, Deyu [1 ,2 ]
Wang, Suge [1 ,2 ,3 ]
Affiliations
[1] Shanxi Univ, Sch Comp & Informat Technol, Taiyuan 030006, Shanxi, Peoples R China
[2] Minist Educ, Key Lab Computat Intelligence & Chinese Informat, Taiyuan 030006, Peoples R China
[3] Shanxi Univ, Sch Math Sci, Taiyuan 030006, Shanxi, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
◊-concept lattice; Multi-instance learning; Local target feature set; Ensemble learning; Content-based image retrieval
DOI
10.1016/j.knosys.2011.05.010
Chinese Library Classification (CLC) code
TP18 [Artificial intelligence theory];
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
This paper introduces concept lattices and ensemble learning techniques into multi-instance learning and proposes a multi-instance ensemble learning model based on concept lattices, which can be applied to content-based image retrieval and similar tasks. In this model, a ◊-concept lattice is first built from the training set. Because bags, rather than the instances they contain, serve as the objects of the formal context when the ◊-concept lattice is constructed, the time and space complexity are reduced to a certain extent. Second, the multi-instance learning problem is decomposed, on the basis of the ◊-concept lattice, into multiple local multi-instance learning problems, and a local target feature set is then found for each local problem. Finally, the whole training set can be classified almost correctly by an ensemble of the local target feature sets. Theoretical analysis and extensive experiments show that the method is effective. The conclusions of this paper not only help to understand multi-instance learning from the perspective of concept lattices, but also provide a new theoretical basis for data analysis and processing. Crown Copyright (C) 2011 Published by Elsevier B.V. All rights reserved.
Pages: 1203-1213
Page count: 11
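
The abstract above describes the pipeline only at a high level: treat bags (not instances) as the objects of a formal context, derive formal concepts, extract "local target feature sets" from concepts, and combine them into an ensemble classifier. The following is a minimal, hypothetical Python sketch of that general idea, not the authors' algorithm: the attribute-set bag representation, the brute-force concept enumeration, the pure-positive criterion in local_target_features, and the containment vote in classify are all illustrative assumptions.

```python
from itertools import combinations

# Toy data: each bag is represented by the set of binary attributes that at
# least one of its instances exhibits (a hypothetical simplification).
bags = {
    "bag1": {"a", "b", "c"},
    "bag2": {"a", "b"},
    "bag3": {"b", "c", "d"},
    "bag4": {"c", "d"},
}
labels = {"bag1": 1, "bag2": 1, "bag3": 0, "bag4": 0}  # 1 = positive bag

ALL_ATTRS = set().union(*bags.values())

def extent_of(attrs):
    """Bags (objects of the formal context) possessing every attribute in attrs."""
    return {g for g, a in bags.items() if attrs <= a}

def intent_of(objs):
    """Attributes shared by every bag in objs (all attributes if objs is empty)."""
    return set.intersection(*(bags[g] for g in objs)) if objs else set(ALL_ATTRS)

def formal_concepts():
    """Enumerate (extent, intent) pairs by closing every subset of bags.
    Brute force and exponential in the number of bags, but bags are far
    fewer than instances, which is the point made in the abstract."""
    concepts, seen = [], set()
    for r in range(len(bags) + 1):
        for subset in combinations(bags, r):
            intent = intent_of(set(subset))
            extent = extent_of(intent)
            if frozenset(extent) not in seen:
                seen.add(frozenset(extent))
                concepts.append((extent, intent))
    return concepts

def local_target_features(extent, intent):
    """Hypothetical criterion: keep a concept's intent as a 'local target
    feature set' only if its extent is non-empty and purely positive."""
    if extent and intent and all(labels[g] == 1 for g in extent):
        return intent
    return None

local_sets = [s for c in formal_concepts() if (s := local_target_features(*c))]

def classify(bag_attrs):
    """Illustrative ensemble vote: a bag is labeled positive if it contains
    at least one local target feature set."""
    return int(any(s <= bag_attrs for s in local_sets))

print(local_sets)              # e.g. [{'a', 'b', 'c'}, {'a', 'b'}]
print(classify({"a", "b"}))    # 1 -> contains a feature set from positive bags
print(classify({"c", "d"}))    # 0 -> no positive local feature set is contained
```

The sketch only illustrates why using bags as formal objects keeps the context small; a faithful implementation would follow the paper's ◊-concept lattice construction and its definition of local target feature sets rather than the simple purity test used here.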