80 references in total
- [1] Ainslie J., Lee-Thorp J., de Jong M., Zemlyanskiy Y., Lebron F., Sanghai S., GQA: Training Generalized Multi-Query Transformer Models from Multi-Head Checkpoints, EMNLP, pp. 4895-4901, (2023)
- [2] Anil R., Dai A.M., et al., (2023)
- [3] Arroyo D.M., Postels J., Tombari F., Variational Transformer Networks for Layout Generation, CVPR, pp. 13642-13652, (2021)
- [4] Blumenthal S., Multinomial Sampling with Partially Categorized Data, Journal of the American Statistical Association, 63, 322, pp. 542-551, (1968)
- [5] Brown T., Mann B., et al., Language Models are Few-Shot Learners, NeurIPS, 33, pp. 1877-1901, (2020)
- [6] Chai S., Zhuang L., Yan F., LayoutDM: Transformer-Based Diffusion Model for Layout Generation, CVPR, pp. 18349-18358, (2023)
- [7] Chowdhery A., Narang S., et al., PaLM: Scaling Language Modeling with Pathways, Journal of Machine Learning Research, 24, 240, pp. 1-113, (2023)
- [8] Chung H.W., Hou L., et al., Scaling Instruction-Finetuned Language Models, (2022)
- [9] Deka B., Huang Z., Franzen C., Hibschman J., Afergan D., Li Y., Nichols J., Kumar R., Rico: A Mobile App Dataset for Building Data-Driven Design Applications, UIST, pp. 845-854, (2017)
- [10] Devlin J., Chang M.W., Lee K., Toutanova K., BERT: Pre-Training of Deep Bidirectional Transformers for Language Understanding, NAACL, pp. 4171-4186, (2019)