Deep Learning From Multiple Noisy Annotators as A Union

Cited by: 14
Authors
Wei, Hongxin [1 ]
Xie, Renchunzi [1 ]
Feng, Lei [2 ]
Han, Bo [3 ]
An, Bo [1 ]
Affiliations
[1] Nanyang Technol Univ, Sch Comp Sci & Engn, Singapore 639798, Singapore
[2] Chongqing Univ, Coll Comp Sci, Chongqing 400044, Peoples R China
[3] Hong Kong Baptist Univ, Dept Comp Sci, Hong Kong, Peoples R China
Funding
National Natural Science Foundation of China; National Research Foundation of Singapore
Keywords
Training; Deep learning; Labeling; Noise measurement; Neural networks; Standards; Learning systems; Annotators; crowdsourcing; noisy labels; transition matrix; classification
DOI
10.1109/TNNLS.2022.3168696
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Crowdsourcing is a popular solution for large-scale data annotations. So far, various end-to-end deep learning methods have been proposed to improve the practical performance of learning from crowds. Despite their practical effectiveness, most of them have two major limitations: they do not hold learning consistency and suffer from computational inefficiency. In this article, we propose a novel method named UnionNet, which is not only theoretically consistent but also experimentally effective and efficient. Specifically, unlike existing methods that either fit a given label from each annotator independently or fuse all the labels into a reliable one, we concatenate the one-hot encoded vectors of crowdsourced labels provided by all the annotators, which takes all the labeling information as a union and coordinates multiple annotators. In this way, we can directly train an end-to-end deep neural network by maximizing the likelihood of this union with only a parametric transition matrix. We theoretically prove the learning consistency and experimentally show the effectiveness and efficiency of our proposed method.
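The abstract sketches the mechanism at a high level: the one-hot vectors of all annotators' labels are gathered into a single union, and the network is trained by maximizing the likelihood of that union through one parametric transition matrix. The following PyTorch sketch illustrates that idea under simplifying assumptions; the names (UnionSketch, union_nll, backbone), the aggregation of one-hot vectors by summation, and the exact loss form are illustrative choices, not the authors' released implementation or objective.

import torch
import torch.nn as nn
import torch.nn.functional as F

class UnionSketch(nn.Module):
    """Classifier plus a single parametric transition matrix (illustrative sketch)."""
    def __init__(self, backbone: nn.Module, num_classes: int):
        super().__init__()
        self.backbone = backbone
        # Unconstrained parameters; a row-wise softmax keeps the matrix row-stochastic.
        self.transition_logits = nn.Parameter(2.0 * torch.eye(num_classes))

    def forward(self, x):
        clean_probs = F.softmax(self.backbone(x), dim=-1)   # p(true class | x)
        T = F.softmax(self.transition_logits, dim=-1)       # p(annotated class | true class)
        noisy_probs = clean_probs @ T                        # p(annotated class | x)
        return clean_probs, noisy_probs

def union_nll(noisy_probs, crowd_labels, num_classes):
    # crowd_labels: LongTensor [batch, num_annotators]; -1 marks a missing label.
    # The "union" here is the sum of the annotators' one-hot vectors, normalized
    # to a distribution (an assumption for this sketch).
    mask = (crowd_labels >= 0).float().unsqueeze(-1)
    one_hot = F.one_hot(crowd_labels.clamp(min=0), num_classes).float() * mask
    union = one_hot.sum(dim=1)
    union = union / union.sum(dim=-1, keepdim=True).clamp(min=1.0)
    return -(union * noisy_probs.clamp(min=1e-12).log()).sum(dim=-1).mean()

# Toy usage: 8 examples, 20 features, 5 classes, 3 annotators.
backbone = nn.Sequential(nn.Linear(20, 64), nn.ReLU(), nn.Linear(64, 5))
model = UnionSketch(backbone, num_classes=5)
x = torch.randn(8, 20)
crowd = torch.randint(0, 5, (8, 3))
_, noisy = model(x)
loss = union_nll(noisy, crowd, num_classes=5)
loss.backward()

Keeping the transition matrix row-stochastic via a row-wise softmax means clean_probs @ T remains a valid distribution over annotated labels; the paper's consistency guarantee applies to its own objective, which this toy loss does not attempt to reproduce.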
Pages: 10552-10562
Page count: 11
Related Papers
50 records in total
  • [1] Trustable Co-Label Learning From Multiple Noisy Annotators
    Li, Shikun
    Liu, Tongliang
    Tan, Jiyong
    Zeng, Dan
    Ge, Shiming
    IEEE TRANSACTIONS ON MULTIMEDIA, 2023, 25 : 1045 - 1057
  • [2] Learning from multiple annotators with varying expertise
    Yan, Yan
    Rosales, Romer
    Fung, Glenn
    Subramanian, Ramanathan
    Dy, Jennifer
    MACHINE LEARNING, 2014, 95 (03) : 291 - 327
  • [3] A Convergence Path to Deep Learning on Noisy Labels
    Liu, Defu
    Tsang, Ivor W.
    Yang, Guowu
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024, 35 (04) : 5170 - 5182
  • [4] Learning From Noisy Labels With Deep Neural Networks: A Survey
    Song, Hwanjun
    Kim, Minseok
    Park, Dongmin
    Shin, Yooju
    Lee, Jae-Gil
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2023, 34 (11) : 8135 - 8153
  • [5] Deep Gaussian Processes for Classification With Multiple Noisy Annotators. Application to Breast Cancer Tissue Classification
    Lopez-Perez, Miguel
    Morales-Alvarez, Pablo
    Cooper, Lee A. D.
    Molina, Rafael
    Katsaggelos, Aggelos K.
    IEEE ACCESS, 2023, 11 : 6922 - 6934
  • [6] Learning from Multiple Annotators: when Data is Hard and Annotators are Unreliable
    Wolley, Chirine
    Quafafou, Mohamed
    12TH IEEE INTERNATIONAL CONFERENCE ON DATA MINING WORKSHOPS (ICDMW 2012), 2012 : 514 - 521
  • [7] Deep Learning from Noisy Labels with Some Adjustments of a Recent Method
    Fazekas, Istvan
    Forian, Laszlo
    Barta, Attila
    INFOCOMMUNICATIONS JOURNAL, 2023, 15 : 9 - 12
  • [8] Chained Deep Learning Using Generalized Cross-Entropy for Multiple Annotators Classification
    Triana-Martinez, Jenniffer Carolina
    Gil-Gonzalez, Julian
    Fernandez-Gallego, Jose A.
    Alvarez-Meza, Andres Marino
    Castellanos-Dominguez, Cesar German
    SENSORS, 2023, 23 (07)
  • [9] Learning from multiple annotators using kernel alignment
    Gil-Gonzalez, J.
    Alvarez-Meza, A.
    Orozco-Gutierrez, A.
    PATTERN RECOGNITION LETTERS, 2018, 116 : 150 - 156