Paradigm Shift in Natural Language Processing

Cited by: 0
Authors
Tian-Xiang Sun
Xiang-Yang Liu
Xi-Peng Qiu
Xuan-Jing Huang
Affiliations
[1] Fudan University,School of Computer Science
[2] Fudan University,Shanghai Key Laboratory of Intelligent Information Processing
Source
Machine Intelligence Research | 2022, Vol. 19
Keywords
Natural language processing; pre-trained language models; deep learning; sequence-to-sequence; paradigm shift
DOI
Not available
Abstract
In the era of deep learning, modeling for most natural language processing (NLP) tasks has converged into several mainstream paradigms. For example, the sequence labeling paradigm is typically adopted to solve a range of tasks such as POS tagging, named entity recognition (NER), and chunking, while the classification paradigm is adopted for tasks like sentiment analysis. With the rapid progress of pre-trained language models, recent years have witnessed a rising trend of paradigm shift, in which one NLP task is solved in a new paradigm by reformulating it. The paradigm shift has achieved great success on many tasks and is becoming a promising way to improve model performance. Moreover, some of these paradigms have shown great potential to unify a large number of NLP tasks, making it possible to build a single model that handles diverse tasks. In this paper, we review this phenomenon of paradigm shift in recent years, highlighting several paradigms that have the potential to solve many different NLP tasks.
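To make the notion of paradigm shift concrete, the sketch below (not taken from the paper) illustrates one commonly discussed reformulation: casting sentiment analysis, usually handled in the classification paradigm, as a cloze-style fill-in-the-blank task answered by a pre-trained masked language model. The Hugging Face transformers pipeline, the bert-base-uncased checkpoint, and the prompt/verbalizer words are illustrative assumptions, not the survey's own method.

```python
# Minimal sketch: sentiment classification reformulated as a cloze (fill-mask) task.
# Assumes the `transformers` package and the `bert-base-uncased` checkpoint are available;
# the prompt template and verbalizer words are hypothetical choices for illustration.

from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-uncased")

def classify_sentiment(review: str) -> str:
    # Wrap the input in a prompt so the class label becomes a masked token to predict.
    prompt = f"{review} Overall, the movie was [MASK]."
    # Restrict predictions to the verbalizer words that stand in for the labels.
    predictions = fill_mask(prompt, targets=["great", "terrible"])
    # Map the higher-scoring word back to its class label.
    verbalizer = {"great": "positive", "terrible": "negative"}
    best = max(predictions, key=lambda p: p["score"])
    return verbalizer[best["token_str"]]

print(classify_sentiment("The plot was thin and the acting was worse."))
```

The same idea recurs across the paradigms the survey covers: the task input is rewritten so that an existing model family (masked LM, sequence-to-sequence, machine reading comprehension, etc.) can produce the answer directly.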
Pages: 169-183
Number of pages: 14