MATRIX SMOOTHING: A REGULARIZATION FOR DNN WITH TRANSITION MATRIX UNDER NOISY LABELS

Cited: 0
Authors
Lv, Xianbin [1 ,2 ]
Wu, Dongxian [1 ,2 ]
Xia, Shu-Tao [1 ,2 ]
Affiliations
[1] Tsinghua Univ, Tsinghua Shenzhen Int Grad Sch, Shenzhen, Peoples R China
[2] Peng Cheng Lab, PCL Res Ctr Networks & Commun, Shenzhen, Peoples R China
Source
2020 IEEE INTERNATIONAL CONFERENCE ON MULTIMEDIA AND EXPO (ICME) | 2020
Funding
National Natural Science Foundation of China;
Keywords
noisy labels; deep learning; robustness;
DOI
10.1109/icme46284.2020.9102853
Chinese Library Classification (CLC)
TP31 [Computer Software];
Discipline Code
081202 ; 0835 ;
Abstract
Training deep neural networks (DNNs) in the presence of noisy labels is an important and challenging task. Probabilistic modeling, which combines a classifier with a transition matrix, depicts the transformation from true labels to noisy labels and is a promising approach. However, recent probabilistic methods apply the transition matrix to the DNN directly, neglect the DNN's susceptibility to overfitting, and achieve unsatisfactory performance, especially under uniform noise. In this paper, inspired by label smoothing, we propose a novel method, termed Matrix Smoothing, in which a smoothed transition matrix is used for updating the DNN, restricting overfitting in probabilistic modeling. We empirically demonstrate that our method not only significantly improves the robustness of probabilistic modeling but also obtains a better estimation of the transition matrix.
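The abstract describes smoothing a transition matrix in analogy to label smoothing. A minimal sketch of that idea, assuming (as the paper's exact formulation is not given here) that each row of the row-stochastic transition matrix is blended with the uniform distribution by a hypothetical smoothing weight `epsilon`:

```python
import numpy as np

def smooth_transition_matrix(T, epsilon=0.1):
    """Blend a row-stochastic C x C transition matrix T with the uniform
    matrix, in analogy to label smoothing. `epsilon` is a hypothetical
    smoothing weight; the resulting matrix remains row-stochastic."""
    C = T.shape[0]
    uniform = np.full_like(T, 1.0 / C)  # every entry 1/C
    return (1.0 - epsilon) * T + epsilon * uniform

# Example: smoothing a clean (identity) transition matrix for C = 3 classes
T = np.eye(3)
T_smooth = smooth_transition_matrix(T, epsilon=0.1)
```

During training, the smoothed matrix would replace the raw one when mapping classifier outputs to noisy-label probabilities, so the DNN is not pushed to fit the noisy labels exactly.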
Pages: 6