Human Language Understanding & Reasoning

Cited by: 65
Author
Manning, Christopher D. [1,2,3,4]
Affiliations
[1] Stanford University, Machine Learning, Stanford, CA 94305, USA
[2] Stanford University, Linguistics & Computer Science, Stanford, CA 94305, USA
[3] Stanford Artificial Intelligence Laboratory (SAIL), Stanford, CA 94305, USA
[4] Association for Computational Linguistics, Stroudsburg, PA, USA
DOI
10.1162/daed_a_01905
Chinese Library Classification
C [Social Sciences, General]
Subject Classification
03; 0303
Abstract
The last decade has yielded dramatic and quite surprising breakthroughs in natural language processing through the use of simple artificial neural network computations, replicated on a very large scale and trained over exceedingly large amounts of data. The resulting pretrained language models, such as BERT and GPT-3, have provided a powerful universal language understanding and generation base, which can easily be adapted to many understanding, writing, and reasoning tasks. These models show the first inklings of a more general form of artificial intelligence, which may lead to powerful foundation models in domains of sensory experience beyond just language.
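The pretrain-then-adapt workflow the abstract describes can be made concrete with a short sketch. The example below is illustrative only and does not come from the article: it loads a pretrained BERT checkpoint through the Hugging Face transformers library (an assumed dependency, along with PyTorch) and takes a single fine-tuning step on a toy labeled sentence; the model name, task, and label are hypothetical choices standing in for any downstream understanding task.

```python
# A minimal sketch (not from the article) of the paradigm the abstract
# describes: a pretrained language model such as BERT serves as a universal
# base that is adapted to a downstream understanding task by fine-tuning.
# Assumes the Hugging Face `transformers` and `torch` packages; the model
# name, task, and label below are illustrative, not the author's setup.
import torch
from torch.optim import AdamW
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2  # e.g., a binary sentiment task
)
optimizer = AdamW(model.parameters(), lr=5e-5)

# One gradient step on a single toy example stands in for full fine-tuning.
batch = tokenizer("The movie was surprisingly good.", return_tensors="pt")
labels = torch.tensor([1])  # hypothetical "positive" label
loss = model(**batch, labels=labels).loss
loss.backward()
optimizer.step()  # the pretrained weights shift toward the new task
```

The same pretrained base can be reused across many such tasks by swapping the task head and labels, which is what makes these models a "universal language understanding and generation base" in the abstract's terms.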
Pages: 127-138
Page count: 12
Related Papers
23 records in total
  • [1] Barr, Avron. 1980. AI Magazine (Fall).
  • [2] Bender, E. M. 2020. Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, p. 5185.
  • [3] Boleda, Gemma; Herbelot, Aurelie. 2016. Formal Distributional Semantics: Introduction to the Special Issue. Computational Linguistics 42(4): 619-635.
  • [4] Bommasani, R. 2022. On the Opportunities and Risks of Foundation Models. DOI: 10.48550/arXiv.2108.07258.
  • [5] Brown, T. B. 2020. Advances in Neural Information Processing Systems, Vol. 33.
  • [6] Carroll, Glenn. 1992. Working Notes of the Workshop on Statistically-Based NLP Techniques, p. 1.
  • [7] Caswell, I. 2020. Google AI Blog.
  • [8] Chevalier, Monique. 1978. Description système.
  • [9] De Waal, Frans. 2017. Are We Smart Enough to Know How Smart Animals Are?
  • [10] Devlin, J. 2019. Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (NAACL-HLT 2019), Vol. 1, p. 4171.