Deep neural networks with L1 and L2 regularization for high dimensional corporate credit risk prediction

Cited by: 29
Authors
Yang, Mei [1 ]
Lim, Ming K. [4 ]
Qu, Yingchi [1 ]
Li, Xingzhi [3 ]
Ni, Du [2 ]
Affiliations
[1] Chongqing Univ, Sch Econ & Business Adm, Chongqing 400030, Peoples R China
[2] Nanjing Univ Posts & Telecommun, Sch Management, Jiangsu 210003, Peoples R China
[3] Chongqing Jiaotong Univ, Sch Econ & Management, Chongqing 400074, Peoples R China
[4] Univ Glasgow, Adam Smith Business Sch, Glasgow G14 8QQ, Scotland
Keywords
High dimensional data; Credit risk; Deep neural network; Prediction; L1 regularization; SUPPORT VECTOR MACHINES; FEATURE-SELECTION; DECISION-MAKING; MODELS; CLASSIFICATION; SVM;
DOI
10.1016/j.eswa.2022.118873
CLC Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Accurate credit risk prediction can help companies avoid bankruptcy and make adjustments ahead of time. In corporate credit risk prediction, there is a tendency to include more and more features in the prediction system. However, this often introduces redundant and irrelevant information that greatly impairs the performance of prediction algorithms. Therefore, this study proposes HDNN, an improved deep neural network (DNN) algorithm for high dimensional prediction of corporate credit risk. We first prove theoretically that there is no regularization effect when L1 regularization is added to the batch normalization layer of a DNN, a tacit rule in industrial implementations that had never been proved. In addition, we prove that adding an L2 constraint on top of the L1 regularization resolves this issue. Finally, this study analyzes a case study of credit data with supply chain and network data to demonstrate the superiority of the HDNN algorithm on a high dimensional dataset.
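The sketch below is a minimal illustration (not the authors' released code) of the idea the abstract describes: combining L1 and L2 penalties on dense layers that feed into batch normalization in a DNN classifier for high dimensional tabular data. One common intuition, which may differ from the paper's exact argument, is that batch normalization makes the layer output invariant to a rescaling of the preceding weights, so an L1 penalty alone can shrink weight magnitudes without truly constraining the learned function, whereas an added L2 term restores an effective penalty. The layer widths, penalty strengths, optimizer, and the synthetic data are illustrative assumptions, not values from the paper.

import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, regularizers

def build_model(n_features, l1=1e-4, l2=1e-4):
    """Binary credit-default classifier with combined L1+L2 weight penalties."""
    model = tf.keras.Sequential([
        layers.Input(shape=(n_features,)),
        # Dense -> BatchNorm -> ReLU blocks; the bias is omitted because
        # batch normalization already provides a learnable shift.
        layers.Dense(64, use_bias=False,
                     kernel_regularizer=regularizers.l1_l2(l1=l1, l2=l2)),
        layers.BatchNormalization(),
        layers.Activation("relu"),
        layers.Dense(32, use_bias=False,
                     kernel_regularizer=regularizers.l1_l2(l1=l1, l2=l2)),
        layers.BatchNormalization(),
        layers.Activation("relu"),
        layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer="adam",
                  loss="binary_crossentropy",
                  metrics=[tf.keras.metrics.AUC(name="auc")])
    return model

# Toy usage with synthetic high dimensional data standing in for real credit,
# supply chain, and network features (placeholder, not the paper's dataset).
X = np.random.randn(2000, 500).astype("float32")
y = (np.random.rand(2000) < 0.1).astype("float32")  # imbalanced default labels
model = build_model(n_features=500)
model.fit(X, y, epochs=3, batch_size=64, verbose=0)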
Pages: 9
Related Papers
50 records in total
  • [31] RECURRENT NEURAL NETWORK WITH L1/2 REGULARIZATION FOR REGRESSION AND MULTICLASS CLASSIFICATION PROBLEMS
    Li, Lin
    Fan, Qinwei
    Zhou, Li
    JOURNAL OF NONLINEAR FUNCTIONAL ANALYSIS, 2022, 2022
  • [33] Sparse kernel logistic regression based on L1/2 regularization
    Xu, Chen
    Peng, ZhiMing
    Jing, WenFeng
    Science China Information Sciences, 2013, 56 (04) : 75 - 90
  • [34] Sorted L1/L2 Minimization for Sparse Signal Recovery
    Wang, Chao
    Yan, Ming
    Yu, Junjie
    JOURNAL OF SCIENTIFIC COMPUTING, 2024, 99 (02)
  • [35] Improving Malaria Detection Using L1 Regularization Neural Network
    Hcini, Ghazala
    Jdey, Imen
    Ltifi, Hela
    JOURNAL OF UNIVERSAL COMPUTER SCIENCE, 2022, 28 (10) : 1087 - 1107
  • [36] A pruning algorithm with relaxed conditions for high-order neural networks based on smoothing group L1/2 regularization and adaptive momentum
    Kang, Qian
    Fan, Qinwei
    Zurada, Jacek M.
    Huang, Tingwen
    KNOWLEDGE-BASED SYSTEMS, 2022, 257
  • [37] Relationship of L1 Skills and L2 Aptitude to L2 Anxiety on the Foreign Language Classroom Anxiety Scale
    Sparks, Richard L.
    Patton, Jon
    LANGUAGE LEARNING, 2013, 63 (04) : 870 - 895
  • [38] Application of L1/2 regularization logistic method in heart disease diagnosis
    Zhang, Bowen
    Chai, Hua
    Yang, Ziyi
    Liang, Yong
    Chu, Gejin
    Liu, Xiaoying
    BIO-MEDICAL MATERIALS AND ENGINEERING, 2014, 24 (06) : 3447 - 3454
  • [39] Sparse SAR imaging based on L1/2 regularization
    Zeng, JinShan
    Science China Information Sciences, 2012, 55 (08) : 1755 - 1775
  • [40] Sparse SAR imaging based on L1/2 regularization
    JinShan Zeng
    Jian Fang
    ZongBen Xu
    Science China Information Sciences, 2012, 55 : 1755 - 1775