Nonlinear Regression via Deep Negative Correlation Learning

Cited by: 50
Authors
Zhang, Le [1 ]
Shi, Zenglin [2 ]
Cheng, Ming-Ming [3 ]
Liu, Yun [3 ]
Bian, Jia-Wang [4 ]
Zhou, Joey Tianyi [1 ]
Zheng, Guoyan [5 ]
Zeng, Zeng [1 ]
Affiliations
[1] ASTAR, Singapore 138632, Singapore
[2] Univ Amsterdam, NL-1012 WX Amsterdam, Netherlands
[3] Nankai Univ, Coll Comp Sci, TKLNDST, Tianjin 300071, Peoples R China
[4] Univ Adelaide, Sch Comp Sci, Adelaide, SA 5005, Australia
[5] Shanghai Jiao Tong Univ, Sch Biomed Engn, Shanghai 200240, Peoples R China
Keywords
Task analysis; Estimation; Training; Correlation; Computational modeling; Deep learning; Computer vision; deep regression; negative correlation learning; convolutional neural network; IMAGE QUALITY ASSESSMENT; HUMAN AGE ESTIMATION; SUPERRESOLUTION; ENSEMBLES; CASCADE;
DOI
10.1109/TPAMI.2019.2943860
Chinese Library Classification (CLC) number
TP18 [Artificial Intelligence Theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Nonlinear regression has been extensively employed in many computer vision problems (e.g., crowd counting, age estimation, affective computing). Under the umbrella of deep learning, two common solutions exist: i) transforming nonlinear regression into a robust loss function that is jointly optimizable with the deep convolutional network, and ii) utilizing an ensemble of deep networks. Although both improve performance to some extent, the former may be limited by the intrinsic drawback of committing to a single hypothesis, and the latter may suffer from much larger computational complexity. To cope with these issues, we propose to regress in an efficient "divide and conquer" manner. The core of our approach is a generalization of negative correlation learning, which has been shown, both theoretically and empirically, to work well for non-deep regression problems. Without extra parameters, the proposed method controls the bias-variance-covariance trade-off systematically and usually yields a deep regression ensemble in which each base model is both "accurate" and "diversified." Moreover, we show that each sub-problem in the proposed method has lower Rademacher complexity and is thus easier to optimize. Extensive experiments on several diverse and challenging tasks, including crowd counting, personality analysis, age estimation, and image super-resolution, demonstrate the superiority of the proposed method over challenging baselines as well as its versatility. The source code and trained models are available on our project page: https://mmcheng.net/dncl/.
Pages: 982-998
Number of pages: 17
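As a reading aid for the approach summarized in the abstract, below is a minimal sketch (in PyTorch, not the authors' released code) of the classic negative correlation learning loss that the paper generalizes to deep ensembles: each branch is penalized for its own squared error and rewarded for deviating from the ensemble mean, which is how the bias-variance-covariance trade-off is controlled. The network shapes, the `lam` coefficient, and the helper names are illustrative assumptions rather than details taken from the paper.

```python
import torch
import torch.nn as nn


class NCLRegressionLoss(nn.Module):
    """Negative correlation learning loss for an ensemble of regressors.

    L = mean over branches i and samples of
        (f_i - y)^2 - lam * (f_i - f_bar)^2,
    where f_bar is the ensemble mean. lam = 0 trains every branch
    independently; lam = 1 is equivalent to training only the ensemble
    mean; values in between trade individual accuracy against diversity.
    """

    def __init__(self, lam: float = 0.5):
        super().__init__()
        self.lam = lam  # diversity-penalty weight (illustrative default)

    def forward(self, preds: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
        # preds: (M, B) predictions from M branches for a batch of size B
        # target: (B,) regression targets
        f_bar = preds.mean(dim=0, keepdim=True)        # ensemble mean, (1, B)
        accuracy = (preds - target.unsqueeze(0)) ** 2  # per-branch squared error
        diversity = (preds - f_bar) ** 2               # deviation from the mean
        return (accuracy - self.lam * diversity).mean()


# Toy usage: M lightweight regression heads sharing one feature extractor.
# The linear "backbone", the shapes, and lam=0.5 are assumptions for this sketch.
M, B, D_in, D_feat = 4, 8, 32, 16
backbone = nn.Linear(D_in, D_feat)  # stand-in for a shared CNN trunk
heads = nn.ModuleList([nn.Linear(D_feat, 1) for _ in range(M)])
criterion = NCLRegressionLoss(lam=0.5)

x, y = torch.randn(B, D_in), torch.randn(B)
feats = backbone(x)
preds = torch.stack([h(feats).squeeze(-1) for h in heads])  # (M, B)
loss = criterion(preds, y)
loss.backward()
```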
Related papers
50 records in total
  • [1] Ensemble learning via negative correlation
    Liu, Y
    Yao, X
    NEURAL NETWORKS, 1999, 12 (10) : 1399 - 1404
  • [2] Deep negative correlation classification
    Zhang, Le
    Hou, Qibin
    Liu, Yun
    Bian, Jia-Wang
    Xu, Xun
    Zhou, Joey Tianyi
    Zhu, Ce
    MACHINE LEARNING, 2024, 113 (10) : 7223 - 7241
  • [3] A hybrid ensemble method with negative correlation learning for regression
    Bai, Yun
    Tian, Ganglin
    Kang, Yanfei
    Jia, Suling
    MACHINE LEARNING, 2023, 112 (10) : 3881 - 3916
  • [4] A selective deep stacked denoising autoencoders ensemble with negative correlation learning for gearbox fault diagnosis
    Yu, Jianbo
    COMPUTERS IN INDUSTRY, 2019, 108 : 62 - 72
  • [5] Asymmetric Effects of Different Training-Testing Mismatch Types on Myoelectric Regression via Deep Learning
    Becman, Eric Cito
    Driemeier, Larissa
    Levin, Oron
    Swinnen, Stephan
    Forner-Cordero, Arturo
    IEEE JOURNAL OF BIOMEDICAL AND HEALTH INFORMATICS, 2023, 27 (04) : 1857 - 1868
  • [6] Autofocus Measurement for Electronic Components Using Deep Regression
    Reynoso Farnes, Saul A.
    Tsai, Du-Ming
    Chiu, Wei-Yao
    IEEE TRANSACTIONS ON COMPONENTS PACKAGING AND MANUFACTURING TECHNOLOGY, 2021, 11 (04): : 697 - 707
  • [7] Research Ideas Discovery via Hierarchical Negative Correlation
    Chen, Lyuzhou
    Wang, Xiangyu
    Ban, Taiyu
    Usman, Muhammad
    Liu, Shikang
    Lyu, Derui
    Chen, Huanhuan
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024, 35 (02) : 1639 - 1650
  • [8] Semisupervised Negative Correlation Learning
    Chen, Huanhuan
    Jiang, Bingbing
    Yao, Xin
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2018, 29 (11) : 5366 - 5379
  • [9] Modularizing Deep Learning via Pairwise Learning With Kernels
    Duan, Shiyu
    Yu, Shujian
    Principe, Jose C.
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2022, 33 (04) : 1441 - 1451