HIERARCHICAL MULTITASK LEARNING WITH CTC

Cited by: 0
Authors
Sanabria, Ramon [1]
Metze, Florian [1]
Affiliations
[1] Carnegie Mellon Univ, Sch Comp Sci, Language Technol Inst, Pittsburgh, PA 15213 USA
Keywords
hierarchical multitask learning; ASR; CTC;
DOI
Not available
CLC number
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In Automatic Speech Recognition, it is still challenging to learn useful intermediate representations when using high-level (or abstract) target units such as words. For that reason, when only a few hundred hours of training data are available, character- or phoneme-based systems tend to outperform word-based systems. In this paper, we show how Hierarchical Multitask Learning can encourage the formation of useful intermediate representations. We achieve this by performing Connectionist Temporal Classification at different levels of the network with targets of different granularity. Our model thus performs predictions at multiple scales for the same input. On the standard 300h Switchboard training setup, our hierarchical multitask architecture demonstrates improvements over single-task architectures with the same number of parameters. Our model obtains 14.0% Word Error Rate on the Switchboard subset of the Eval2000 test set without any decoder or language model, outperforming the current state-of-the-art on non-autoregressive Acoustic-to-Word models.
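The abstract's key ingredient is the CTC loss, applied at several depths of the network with targets of different granularity. As a hedged illustration (not the paper's implementation), the pure-Python sketch below computes the CTC negative log-likelihood for one output layer via the standard forward (alpha) recursion over the blank-interleaved label sequence; the function name and toy inputs are assumptions for this example.

```python
import math

def ctc_neg_log_likelihood(log_probs, labels, blank=0):
    """CTC loss via the forward (alpha) recursion.
    log_probs: list of T frames, each a list of per-symbol log-probabilities.
    labels: target label sequence (without blanks)."""
    # Interleave blanks: l' = [blank, l1, blank, l2, ..., blank]
    ext = [blank]
    for l in labels:
        ext += [l, blank]
    S, T = len(ext), len(log_probs)
    NEG_INF = float("-inf")

    def logsumexp(*xs):
        m = max(xs)
        if m == NEG_INF:
            return NEG_INF
        return m + math.log(sum(math.exp(x - m) for x in xs))

    # alpha[s]: log-prob of all path prefixes ending in ext[s] at the current frame
    alpha = [NEG_INF] * S
    alpha[0] = log_probs[0][ext[0]]
    if S > 1:
        alpha[1] = log_probs[0][ext[1]]
    for t in range(1, T):
        new = [NEG_INF] * S
        for s in range(S):
            a = alpha[s]                      # stay on the same symbol
            if s > 0:
                a = logsumexp(a, alpha[s - 1])  # advance by one
            # Skip over a blank, unless the target is blank or repeats
            if s > 1 and ext[s] != blank and ext[s] != ext[s - 2]:
                a = logsumexp(a, alpha[s - 2])
            new[s] = a + log_probs[t][ext[s]]
        alpha = new
    # Valid alignments end in the last label or the final blank
    return -logsumexp(alpha[S - 1], alpha[S - 2] if S > 1 else NEG_INF)
```

In the hierarchical multitask setup the abstract describes, one such loss would be computed per granularity (e.g. character targets from an intermediate layer, word targets from the top layer), and the training objective is their sum over all levels.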
Pages: 485-490 (6 pages)