Fast Multi-label Learning via Hashing

Cited: 0
Authors
Hu, Haifeng [1 ]
Sun, Yong [1 ]
Wu, Jiansheng [2 ,3 ]
Affiliations
[1] Nanjing Univ Posts & Telecommun, Sch Telecommun & Informat Engn, Nanjing, Jiangsu, Peoples R China
[2] Nanjing Univ Posts & Telecommun, Nanjing, Jiangsu, Peoples R China
[3] Arizona State Univ, Sch Comp Informat & Decis Syst Engn, Ind Engn, Tempe, AZ USA
Source
KNOWLEDGE SCIENCE, ENGINEERING AND MANAGEMENT, KSEM 2015 | 2015, Vol. 9403
Keywords
Multi-label Learning; Fast; Hashing; Label dependency
DOI
10.1007/978-3-319-25159-2_48
CLC Number (Chinese Library Classification)
TP18 [Theory of Artificial Intelligence]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Multi-label learning (MLL) copes with classification problems in which each instance can be tagged with multiple labels simultaneously. Over the last several years, many MLL algorithms have been proposed and have achieved excellent performance in multiple applications. However, these approaches are usually time-consuming and cannot handle large-scale data. In this paper, we propose a fast multi-label learning algorithm, HashMLL, based on hashing schemes. HashMLL uses Locality Sensitive Hashing (LSH) to identify the neighboring instances of each unseen instance, and exploits label correlation by estimating the similarity between labels through min-wise independent permutation locality sensitive hashing (MinHash). Then, relying on statistical information obtained from all related labels of the neighboring instances, the maximum a posteriori (MAP) principle is used to determine the label set of each unseen instance. Experiments show that HashMLL is highly competitive with state-of-the-art techniques while its time cost is far lower. In particular, on the NUS-WIDE dataset with 269,648 instances and the Flickr dataset with 565,444 instances, where none of the existing methods can return results within 24 hours, HashMLL takes only 90 seconds and 23,266 seconds, respectively.
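To make the pipeline described in the abstract concrete, the following is a minimal, self-contained Python sketch of a HashMLL-style workflow, not the authors' implementation: random-projection LSH buckets candidate neighbors for a query instance, MinHash signatures over each label's support set approximate label-label (Jaccard) similarity, and a simple frequency-plus-correlation vote stands in for the paper's MAP rule. All names, hyper-parameters, and the toy data below are illustrative assumptions.

# A HashMLL-style sketch (illustrative only; not the authors' reference code).
import numpy as np

rng = np.random.default_rng(0)

# --- toy multi-label data (assumption: dense features, binary label matrix) ---
n, d, q = 200, 16, 5                      # instances, feature dimension, number of labels
X = rng.normal(size=(n, d))
Y = (rng.random((n, q)) < 0.2).astype(int)

# --- random-projection LSH over instances ---
n_bits = 8
planes = rng.normal(size=(d, n_bits))     # random hyperplanes

def lsh_key(x):
    # Sign pattern of the projections serves as the hash-bucket key.
    return tuple((x @ planes > 0).astype(int))

buckets = {}
for i in range(n):
    buckets.setdefault(lsh_key(X[i]), []).append(i)

# --- MinHash signatures over label supports (sets of positive instances per label) ---
n_perm = 64
perms = [rng.permutation(n) for _ in range(n_perm)]

def minhash(instance_set):
    # Minimum permuted index per permutation; empty sets map to the sentinel n.
    return np.array([min(p[i] for i in instance_set) if instance_set else n
                     for p in perms])

label_sets = [set(np.flatnonzero(Y[:, l])) for l in range(q)]
signatures = [minhash(s) for s in label_sets]

def label_similarity(a, b):
    # Fraction of matching signature entries estimates the Jaccard similarity.
    return float(np.mean(signatures[a] == signatures[b]))

# --- MAP-style prediction for an unseen instance ---
def predict(x, threshold=0.5):
    neighbours = buckets.get(lsh_key(x), [])
    if not neighbours:
        return np.zeros(q, dtype=int)
    counts = Y[neighbours].mean(axis=0)   # empirical label frequencies among neighbours
    # Borrow evidence from correlated labels (a crude stand-in for the paper's MAP rule).
    boosted = [max(counts[l], max((label_similarity(l, m) * counts[m]
                                   for m in range(q) if m != l), default=0.0))
               for l in range(q)]
    return (np.array(boosted) >= threshold).astype(int)

print(predict(rng.normal(size=d)))

The label-similarity step mirrors the MinHash idea from the abstract: two labels whose sets of positive instances overlap heavily agree on most random permutations, so the fraction of matching signature entries approximates their Jaccard similarity without comparing the full label columns.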
Pages: 535-546
Page count: 12
Related Papers
50 records in total
  • [1] Adversarial Multi-Label Variational Hashing
    Lu, Jiwen
    Liong, Venice Erin
    Tan, Yap-Peng
    IEEE TRANSACTIONS ON IMAGE PROCESSING, 2021, 30 : 332 - 344
  • [2] A FRAMEWORK OF HASHING FOR MULTI-INSTANCE MULTI-LABEL LEARNING
    Liu, Man
    Xu, Xinshun
    INTERNATIONAL JOURNAL OF INNOVATIVE COMPUTING INFORMATION AND CONTROL, 2015, 11 (03): 921 - 934
  • [3] Multi-label Learning via Codewords
    Sedghi, Mahlagha
    Huang, Yinjie
    Georgiopoulos, Michael
    Anagnostopoulos, Georgios
    2018 IEEE 30TH INTERNATIONAL CONFERENCE ON TOOLS WITH ARTIFICIAL INTELLIGENCE (ICTAI), 2018, : 221 - 228
  • [4] Multi-Label Deep Sparse Hashing
    Liong, Venice Erin
    Lu, Jiwen
    Tan, Yap-Peng
    2018 IEEE INTERNATIONAL CONFERENCE ON VISUAL COMMUNICATIONS AND IMAGE PROCESSING (IEEE VCIP), 2018
  • [5] Fast Multi-Instance Multi-Label Learning
    Huang, Sheng-Jun
    Gao, Wei
    Zhou, Zhi-Hua
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2019, 41 (11) : 2614 - 2627
  • [6] Deep hashing for multi-label image retrieval: a survey
    Rodrigues, Josiane
    Cristo, Marco
    Colonna, Juan G.
    ARTIFICIAL INTELLIGENCE REVIEW, 2020, 53 (07) : 5261 - 5307
  • [7] Multi-label Learning via Supervised Autoencoder
    Lian, Siming
    Liu, Jianwei
    Lu, Runkun
    Luo, Xionglin
    2018 37TH CHINESE CONTROL CONFERENCE (CCC), 2018, : 9416 - 9421
  • [8] Partial Multi-Label Learning via Credible Label Elicitation
    Zhang, Min-Ling
    Fang, Jun-Peng
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2021, 43 (10) : 3587 - 3599
  • [9] Partial Multi-Label Learning via Exploiting Instance and Label Correlations
    Liang, Weichao
    Gao, Guangliang
    Chen, Lei
    Wang, Youquan
    ACM TRANSACTIONS ON KNOWLEDGE DISCOVERY FROM DATA, 2024, 19 (01)