37 entries in total
- [1] On the Dangers of Stochastic Parrots: Can Language Models Be Too Big? [C]. Proceedings of the 2021 ACM Conference on Fairness, Accountability, and Transparency (FAccT 2021), 2021: 610-623
- [2] Brown T B, 2020, Advances in Neural Information Processing Systems, Vol. 33
- [3] Chen M., 2021, arXiv
- [4] Chen Wenhu, 2022, Program of thoughts prompting: Disentangling computation from reasoning for numerical reasoning tasks
- [5] Chen Z Y, 2021, Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing (EMNLP 2021), p. 3697
- [6] Chowdhery A., 2022, PaLM: Scaling language modeling with pathways
- [7] Cohen William W., 2021, MATE: multi-view attention for table transformer efficiency
- [8] Dong H Y, 2022, Proceedings of the Thirty-First International Joint Conference on Artificial Intelligence (IJCAI 2022), p. 5426
- [9] Fu Yao, 2023, Complexity-based prompting for multi-step reasoning
- [10] Gong H, 2020, Proceedings of the 28th International Conference on Computational Linguistics, p. 1978