- [1] OUYANG L, WU J, JIANG X, et al. Training language models to follow instructions with human feedback[EB/OL].
- [2] SCHULMAN J, ZOPH B, KIM C, et al. ChatGPT: Optimizing language models for dialogue[EB/OL].
- [3] BROWN T B, MANN B, RYDER N, et al. Language models are few-shot learners[C]//Proceedings of the 34th International Conference on Neural Information Processing Systems. 2020: 1877-1901.
- [4] LEWIS M, LIU Y H, GOYAL N, et al. BART: Denoising sequence-to-sequence pre-training for natural language generation, translation, and comprehension[C]//Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics. 2020: 7871-7880.
- [5] TURING A M. Computing machinery and intelligence[M]//Parsing the Turing Test. 2009: 23-65.
- [6] ZHANG M. An inquiry into Yan Fu's translation theory of faithfulness, expressiveness, and elegance: The beginning of China's modern translation theory[J]. Trans-Humanities Journal, 2013, 6(3): 179-196.
- [7] RAFFEL C, SHAZEER N, ROBERTS A, et al. Exploring the limits of transfer learning with a unified text-to-text transformer[EB/OL].
- [8] BUBECK S, CHANDRASEKARAN V, ELDAN R, et al. Sparks of artificial general intelligence: Early experiments with GPT-4[EB/OL].
- [9] VASWANI A, SHAZEER N, PARMAR N, et al. Attention is all you need[EB/OL].
- [10] DEVLIN J, CHANG M W, LEE K, et al. BERT: Pre-training of deep bidirectional transformers for language understanding[C]//Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers). 2019: 4171-4186.