Reforming Architecture and Loss Function of Artificial Neural Networks in Binary Classification Problems

Cited by: 0
Authors
Dastgheib, Mohammad A. [1 ]
Raie, Abolghasem A. [1 ]
Affiliations
[1] Amirkabir Univ Technol, Tehran Polytech, Fac Elect Engn, Tehran, Iran
Source
2020 28th Iranian Conference on Electrical Engineering (ICEE) | 2020
Keywords
Artificial Neural Networks; Binary Classification; Classification; Dynamic Thresholding; Multiclass Classification; Neural Networks
DOI
Not available
Chinese Library Classification (CLC) Code
TM [Electrical Technology]; TN [Electronic Technology, Communication Technology]
Discipline Classification Code
0808; 0809
Abstract
Artificial neural networks (ANNs), models inspired by the neurons of the biological human brain, have a long history as leading techniques in machine learning and computational intelligence, with a wide range of applications and continued interest owing to their success. Classification is one of the most active areas of ANN research, with a vast and growing literature. Our contribution is to reform the loss function of ANNs, in accordance with a novel architecture for the last layer of the neural network (NN), enabling the network to apply dynamic thresholding when deciding the label of a sample from its probability-of-belonging values, and hence to model the complexities of the data more discriminatingly and attain better quantitative results. Although the approach is established through a mathematical argument specifically for binary classification, the concept and formulation generalize directly to multiclass classification problems.
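The abstract does not give the authors' formulation, so the following is only a minimal sketch of what a dynamic-threshold last layer for binary classification might look like. The class name DynamicThresholdNet, the two-head output (a probability-of-belonging and a per-sample threshold), and the smooth surrogate loss are all assumptions made for illustration, not the paper's method.

```python
# Hypothetical sketch (not the authors' formulation): a binary classifier whose
# last layer emits both a probability-of-belonging p and a per-sample threshold t,
# so the decision rule "predict 1 iff p > t" replaces a fixed 0.5 cutoff.
import torch
import torch.nn as nn


class DynamicThresholdNet(nn.Module):
    def __init__(self, in_features: int, hidden: int = 32):
        super().__init__()
        self.backbone = nn.Sequential(nn.Linear(in_features, hidden), nn.ReLU())
        self.prob_head = nn.Linear(hidden, 1)       # probability-of-belonging logit
        self.threshold_head = nn.Linear(hidden, 1)  # per-sample threshold logit

    def forward(self, x):
        h = self.backbone(x)
        p = torch.sigmoid(self.prob_head(h)).squeeze(-1)       # in (0, 1)
        t = torch.sigmoid(self.threshold_head(h)).squeeze(-1)  # dynamic threshold in (0, 1)
        return p, t


def dynamic_threshold_loss(p, t, y, temperature: float = 0.1):
    # Smooth surrogate for the hard rule "predict 1 iff p > t":
    # binary cross-entropy on a soft comparison of p against t.
    score = torch.sigmoid((p - t) / temperature)
    return nn.functional.binary_cross_entropy(score, y)


if __name__ == "__main__":
    torch.manual_seed(0)
    x = torch.randn(64, 10)
    y = (x[:, 0] > 0).float()  # toy labels for demonstration only
    model = DynamicThresholdNet(10)
    p, t = model(x)
    loss = dynamic_threshold_loss(p, t, y)
    loss.backward()
    print(float(loss))
```

At inference time the same per-sample threshold would be compared against the predicted probability, which is what distinguishes this setup from a standard sigmoid output with a global 0.5 cutoff.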
Pages: 6