Nonlinear Regression via Deep Negative Correlation Learning

Cited by: 50
Authors
Zhang, Le [1 ]
Shi, Zenglin [2 ]
Cheng, Ming-Ming [3 ]
Liu, Yun [3 ]
Bian, Jia-Wang [4 ]
Zhou, Joey Tianyi [1 ]
Zheng, Guoyan [5 ]
Zeng, Zeng [1 ]
Affiliations
[1] ASTAR, Singapore 138632, Singapore
[2] Univ Amsterdam, NL-1012 WX Amsterdam, Netherlands
[3] Nankai Univ, Coll Comp Sci, TKLNDST, Tianjin 300071, Peoples R China
[4] Univ Adelaide, Sch Comp Sci, Adelaide, SA 5005, Australia
[5] Shanghai Jiao Tong Univ, Sch Biomed Engn, Shanghai 200240, Peoples R China
Keywords
Task analysis; estimation; training; correlation; computational modeling; deep learning; computer vision; deep regression; negative correlation learning; convolutional neural network; image quality assessment; human age estimation; super-resolution; ensembles; cascade
DOI
10.1109/TPAMI.2019.2943860
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Nonlinear regression has been extensively employed in many computer vision problems (e.g., crowd counting, age estimation, affective computing). Under the umbrella of deep learning, two common solutions exist: i) transforming nonlinear regression into a robust loss function that can be jointly optimized with the deep convolutional network, and ii) utilizing an ensemble of deep networks. Although both improve performance, the former is limited by its commitment to a single hypothesis, and the latter suffers from much higher computational cost. To cope with these issues, we propose to regress in an efficient "divide and conquer" manner. The core of our approach is a generalization of negative correlation learning, which has been shown, both theoretically and empirically, to work well for non-deep regression problems. Without extra parameters, the proposed method controls the bias-variance-covariance trade-off systematically and usually yields a deep regression ensemble in which each base model is both "accurate" and "diversified." Moreover, we show that each sub-problem in the proposed method has lower Rademacher complexity and is therefore easier to optimize. Extensive experiments on several diverse and challenging tasks, including crowd counting, personality analysis, age estimation, and image super-resolution, demonstrate the superiority of the proposed method over strong baselines as well as its versatility. The source code and trained models are available on our project page: https://mmcheng.net/dncl/.
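For intuition, the classical negative-correlation penalty that the paper generalizes can be sketched in a few lines of NumPy. This is a minimal illustration of the per-model loss from standard (non-deep) negative correlation learning, not the authors' implementation; the function name, the λ default, and the single-target setting are illustrative choices. Each base model minimizes its squared error plus λ times a penalty p_i = (f_i − f̄) Σ_{j≠i} (f_j − f̄), which simplifies to −(f_i − f̄)², so that deviating from the ensemble mean is rewarded and the members become negatively correlated.

```python
import numpy as np

def ncl_losses(preds, y, lam=0.5):
    """Per-model negative correlation learning losses for one target.

    preds: (M,) array of base-model predictions f_1..f_M for target y.
    lam:   trade-off between individual accuracy and ensemble diversity.
    Returns the (M,) array of losses e_i = (f_i - y)^2 + lam * p_i.
    """
    mean = preds.mean()
    # p_i = (f_i - mean) * sum_{j != i} (f_j - mean) = -(f_i - mean)^2,
    # because the deviations from the ensemble mean sum to zero.
    penalty = -(preds - mean) ** 2
    return (preds - y) ** 2 + lam * penalty
```

Note the bias-variance-covariance trade-off the abstract mentions: at λ = 0 each model is trained independently, while at λ = 1 the summed loss collapses to M·(f̄ − y)², i.e., only the ensemble mean is penalized.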
Pages: 982-998
Page count: 17