Ensemble Kernel Mean Matching

Cited by: 10
Authors
Miao, Yun-Qian [1 ]
Farahat, Ahmed K. [1 ]
Kamel, Mohamed S. [1 ]
Affiliations
[1] Univ Waterloo, Waterloo, ON N2L 3G1, Canada
Source
2015 IEEE INTERNATIONAL CONFERENCE ON DATA MINING (ICDM), 2015
Keywords
Density ratio estimation; Kernel mean matching; Ensemble method; Distributed algorithm
DOI
10.1109/ICDM.2015.127
Chinese Library Classification (CLC) Number
TP18 [Theory of Artificial Intelligence]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Kernel Mean Matching (KMM) is an elegant algorithm that produces density ratios between training and test data by minimizing their maximum mean discrepancy in a kernel space. The applicability of KMM to large-scale problems is, however, hindered by the quadratic complexity of calculating and storing the kernel matrices over the training and test data. To address this problem, this paper proposes a novel ensemble algorithm for KMM, which divides the test samples into smaller partitions, estimates a density ratio for each partition, and then fuses these local estimates with a weighted sum. Our theoretical analysis shows that the ensemble KMM has a lower error bound than the centralized KMM, which uses all the test data at once to estimate the density ratio. Given its suitability for distributed implementation, the proposed algorithm is also favorable in terms of time and space complexity. Experiments on benchmark datasets confirm the superiority of the proposed algorithm in terms of estimation accuracy and running time.
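The abstract describes two ingredients: the standard KMM optimization, which reweights training points so that their kernel mean matches that of the test set, and the partition-then-fuse ensemble scheme. The Python sketch below is illustrative only and not the authors' implementation; it assumes an RBF kernel with a fixed bandwidth, replaces the usual constrained quadratic program with a bounded L-BFGS solve (dropping the constraint that keeps the mean of the weights near 1), and fuses the local estimates with weights proportional to partition size rather than the weighting derived in the paper. The function names `kmm_weights` and `ensemble_kmm` are our own.

```python
import numpy as np
from scipy.optimize import minimize


def rbf_kernel(X, Y, sigma=1.0):
    # Pairwise RBF kernel matrix between the rows of X and the rows of Y.
    d2 = np.sum(X**2, axis=1)[:, None] + np.sum(Y**2, axis=1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-d2 / (2.0 * sigma**2))


def kmm_weights(X_tr, X_te, sigma=1.0, B=1000.0):
    """Estimate density ratios beta for the training points by matching the
    kernel mean of the reweighted training set to that of the test set.
    Minimizes 0.5 * beta^T K beta - kappa^T beta subject to 0 <= beta <= B
    (a simplified, solver-agnostic stand-in for the usual KMM QP)."""
    n_tr, n_te = len(X_tr), len(X_te)
    K = rbf_kernel(X_tr, X_tr, sigma)
    kappa = (n_tr / n_te) * rbf_kernel(X_tr, X_te, sigma).sum(axis=1)
    obj = lambda b: 0.5 * b @ K @ b - kappa @ b
    grad = lambda b: K @ b - kappa
    res = minimize(obj, np.ones(n_tr), jac=grad, method="L-BFGS-B",
                   bounds=[(0.0, B)] * n_tr)
    return res.x


def ensemble_kmm(X_tr, X_te, n_parts=5, sigma=1.0, B=1000.0, seed=None):
    """Partition the test set, run KMM against each partition independently,
    and fuse the local ratio estimates with a weighted sum (weights here are
    proportional to partition size; the paper derives its own weighting)."""
    rng = np.random.default_rng(seed)
    parts = np.array_split(rng.permutation(len(X_te)), n_parts)
    beta = np.zeros(len(X_tr))
    for p in parts:
        beta += (len(p) / len(X_te)) * kmm_weights(X_tr, X_te[p], sigma, B)
    return beta
```

Because each partition's solve in this sketch depends only on the training data and its own slice of the test data, the loop body can run on separate workers and only the local weight vectors need to be gathered, which is the property that makes the ensemble scheme amenable to distributed implementation.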
Pages: 330 - 338
Page count: 9