共 50 条
- [41] Look-Ahead Attention for Generation in Neural Machine Translation NATURAL LANGUAGE PROCESSING AND CHINESE COMPUTING, NLPCC 2017, 2018, 10619 : 211 - 223
- [46] Machine Translation for Indian Languages Utilizing Recurrent Neural Networks and Attention DISTRIBUTED COMPUTING AND OPTIMIZATION TECHNIQUES, ICDCOT 2021, 2022, 903 : 593 - 602
- [48] Neural Machine Translation Models with Attention-Based Dropout Layer CMC-COMPUTERS MATERIALS & CONTINUA, 2023, 75 (02): : 2981 - 3009
- [49] Neural Machine Translation with Attention Based on a New Syntactic Branch Distance MACHINE TRANSLATION, CCMT 2019, 2019, 1104 : 47 - 57
- [50] Multi-Granularity Self-Attention for Neural Machine Translation 2019 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING AND THE 9TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING (EMNLP-IJCNLP 2019): PROCEEDINGS OF THE CONFERENCE, 2019, : 887 - 897