Tackle balancing constraints in semi-supervised ordinal regression

Cited by: 0
Authors
Chenkang Zhang
Heng Huang
Bin Gu
Affiliations
[1] China Mobile (Suzhou) Software Technology Company Limited
[2] Department of Computer Science, University of Maryland
[3] School of Artificial Intelligence, Jilin University
[4] Department of Machine Learning, Mohamed bin Zayed University of Artificial Intelligence
Source
Machine Learning | 2024, Vol. 113
Keywords
Semi-supervised learning; Ordinal regression; Balancing constraint
Abstract
Semi-supervised ordinal regression (S2OR) has been recognized as a valuable technique for improving the performance of an ordinal regression (OR) model by leveraging available unlabeled samples. The balancing constraint is a useful device for semi-supervised algorithms, as it prevents the trivial solution of assigning a large number of unlabeled examples to only a few classes. However, rapid training of an S2OR model with balancing constraints remains an open problem due to the difficulty of formulating and solving the corresponding optimization objective. To tackle this issue, we propose a novel form of balancing constraints and extend the traditional convex–concave procedure (CCCP) approach to solve our objective function. Additionally, we transform the convex inner loop (CIL) problem generated by the CCCP approach into a quadratic problem that resembles a support vector machine, in which multiple equality constraints are treated as virtual samples. As a result, we can utilize an existing fast solver to efficiently solve the CIL problem. Experimental results on several benchmark and real-world datasets not only validate the effectiveness of the proposed algorithm but also demonstrate its superior performance compared to other supervised and semi-supervised algorithms.
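As a rough illustration of the setting described in the abstract (not the authors' exact formulation), a semi-supervised OR objective with a balancing constraint can be sketched as follows, where $w$ and the ordered thresholds $b_1 \le \cdots \le b_{R-1}$ (with $b_0 = -\infty$, $b_R = +\infty$) define the OR model, $L$ and $U$ index labeled and unlabeled samples, $\ell$ and $\hat{\ell}$ are surrogate losses for labeled and unlabeled samples, and $p_r$ are target class proportions; all of these symbols are illustrative placeholders:

\[
\min_{w,\; b_1 \le \cdots \le b_{R-1}} \;
\frac{1}{2}\lVert w \rVert^2
+ C \sum_{i \in L} \ell\bigl(y_i,\, w^\top x_i\bigr)
+ C^{*} \sum_{j \in U} \hat{\ell}\bigl(w^\top x_j\bigr)
\quad \text{s.t.} \quad
\frac{1}{|U|} \sum_{j \in U} \mathbb{1}\bigl[\, w^\top x_j \in (b_{r-1}, b_r] \,\bigr] = p_r,
\qquad r = 1, \dots, R.
\]

The unlabeled-sample loss $\hat{\ell}$ is non-convex; writing it as the sum of a convex and a concave part, $\hat{\ell} = \hat{\ell}_{\mathrm{vex}} + \hat{\ell}_{\mathrm{cav}}$, the CCCP procedure repeatedly linearizes $\hat{\ell}_{\mathrm{cav}}$ at the current iterate and solves the resulting convex inner-loop problem, which, in the abstract's description, is further cast as an SVM-like quadratic program whose equality constraints act as virtual samples.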
Pages: 2575-2595 (20 pages)