ADAPTIVE FEATURE SPLIT SELECTION FOR CO-TRAINING: APPLICATION TO TIRE IRREGULAR WEAR CLASSIFICATION

Cited by: 0
Authors
Du, Wei [1 ]
Phlypo, Ronald [1 ]
Adali, Tuelay [1 ]
Affiliations
[1] Univ Maryland Baltimore Cty, Dept CSEE, Baltimore, MD 21250 USA
Source
2013 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP) | 2013
Keywords
Co-training; semi-supervised classification; feature splits; DCA; LTM tire data; INDEPENDENT COMPONENT ANALYSIS; FRAMEWORK; CRITERIA;
DOI
Not available
Chinese Library Classification
O42 [Acoustics];
Discipline codes
070206; 082403;
Abstract
Co-training is a practical and powerful semi-supervised learning method: it yields high classification accuracy from a training set that contains only a small amount of labeled data. Successful co-training requires two conditions on the features: diversity and sufficiency. In this paper, we propose a novel mutual information (MI) based approach, inspired by dependent component analysis (DCA), to obtain feature splits that are maximally independent between subsets (diversity) or within subsets (sufficiency). We evaluate the relationship between classification performance and the relative importance of the two conditions. Experimental results on actual tire data indicate that sufficiency has a more significant impact on classification accuracy than diversity. Further results show that co-training with feature splits obtained by the MI-based approach yields higher accuracy than supervised classification, and significantly higher accuracy when only a small set of labeled training data is available.
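The abstract's MI-based feature split can be illustrated with a minimal sketch. This is not the paper's DCA-derived algorithm: the names `hist_mi` and `greedy_mi_split`, the plug-in histogram MI estimator, and the greedy assignment rule are all assumptions of this illustration.

```python
import numpy as np

def hist_mi(x, y, bins=8):
    """Plug-in (histogram) estimate of mutual information between two 1-D samples."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of x
    py = pxy.sum(axis=0, keepdims=True)   # marginal of y
    nz = pxy > 0
    return float((pxy[nz] * np.log(pxy[nz] / (px * py)[nz])).sum())

def greedy_mi_split(X):
    """Split the columns of X into two views so that cross-view MI stays low
    (the diversity condition): each feature joins the view it is most
    dependent on, which keeps its MI with the *other* view small."""
    d = X.shape[1]
    mi = np.zeros((d, d))
    for i in range(d):
        for j in range(i + 1, d):
            mi[i, j] = mi[j, i] = hist_mi(X[:, i], X[:, j])
    views = [[0], [1]]                     # seed each view with one feature
    for f in range(2, d):
        dep = [mi[f, v].sum() for v in views]
        views[int(np.argmax(dep))].append(f)
    return views

# Toy data: features {0, 2} and {1, 3} form two strongly dependent groups.
rng = np.random.default_rng(0)
a = rng.normal(size=(2000, 1))
b = rng.normal(size=(2000, 1))
X = np.hstack([a, b,
               a + 0.1 * rng.normal(size=(2000, 1)),
               b + 0.1 * rng.normal(size=(2000, 1))])
views = greedy_mi_split(X)
print(views)   # the two dependent feature groups land in separate views
```

Placing each feature in the view it shares the most MI with keeps dependence within views (toward sufficiency) and leaves the two views close to mutually independent (toward diversity), which is the trade-off the paper evaluates.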
Pages: 3497-3501 (5 pages)
Related Papers
50 records total
  • [1] Adaptive Co-Training SVM for Sentiment Classification on Tweets
    Liu, Shenghua
    Li, Fuxin
    Li, Fangtao
    Cheng, Xueqi
    Shen, Huawei
    PROCEEDINGS OF THE 22ND ACM INTERNATIONAL CONFERENCE ON INFORMATION & KNOWLEDGE MANAGEMENT (CIKM'13), 2013, : 2079 - 2088
  • [2] DCPE co-training for classification
    Xu, Jin
    He, Haibo
    Man, Hong
    NEUROCOMPUTING, 2012, 86 : 75 - 85
  • [3] Co-training with a single natural feature set applied to email classification
    Chan, J
    Koprinska, I
    Poon, J
    IEEE/WIC/ACM INTERNATIONAL CONFERENCE ON WEB INTELLIGENCE (WI 2004), PROCEEDINGS, 2004, : 586 - 589
  • [4] Product feature extraction with co-training
    Wu, Xing
    He, Zhongshi
    Huang, Yongwen
    Journal of Information and Computational Science, 2009, 6 (03): 1515 - 1523
  • [5] Co-training of Feature Extraction and Classification using Partitioned Convolutional Neural Networks
    Tsai, Wei-Yu
    Choi, Jinhang
    Parija, Tulika
    Gomatam, Priyanka
    Das, Chita
    Sampson, John
    Narayanan, Vijaykrishnan
    PROCEEDINGS OF THE 2017 54TH ACM/EDAC/IEEE DESIGN AUTOMATION CONFERENCE (DAC), 2017,
  • [6] Evaluation criteria of feature splits for co-training
    Terabe, Masahiro
    Hashimoto, Kazuo
    IMECS 2008: INTERNATIONAL MULTICONFERENCE OF ENGINEERS AND COMPUTER SCIENTISTS, VOLS I AND II, 2008, : 540 - 544
  • [7] SEMI-SUPERVISED PYRAMID FEATURE CO-TRAINING NETWORK FOR LIDAR DATA CLASSIFICATION
    Wang, Zexin
    Wang, Haoran
    Jiao, Licheng
    Liu, Xu
    2019 IEEE INTERNATIONAL GEOSCIENCE AND REMOTE SENSING SYMPOSIUM (IGARSS 2019), 2019, : 2471 - 2474
  • [8] Vertical Ensemble Co-Training for Text Classification
    Katz, Gilad
    Caragea, Cornelia
    Shabtai, Asaf
    ACM TRANSACTIONS ON INTELLIGENT SYSTEMS AND TECHNOLOGY, 2018, 9 (02)
  • [9] Co-Training with Adaptive Bayesian Classifier Combination
    Yaslan, Yusuf
    Cataltepe, Zehra
    23RD INTERNATIONAL SYMPOSIUM ON COMPUTER AND INFORMATION SCIENCES, 2008, : 620 - 623
  • [10] Classification of Online Medical Discourse by Modified Co-training
    Alnashwan, Rana
    Sorensen, Humphrey
    O'Riordan, Adrian
    2019 IEEE FIFTH INTERNATIONAL CONFERENCE ON BIG DATA COMPUTING SERVICE AND APPLICATIONS (IEEE BIGDATASERVICE 2019), 2019, : 131 - 137