46 references in total
[1]
Rumelhart D., Hinton G.E., Williams R.J., Learning Internal Representations by Error Propagation, (1986)
[2]
Hochreiter S., Schmidhuber J., Long short-term memory, Neural Computation, 9, 8, pp. 1735-1780, (1997)
[3]
Cho K., van Merrienboer B., Bahdanau D., Bengio Y., On the properties of neural machine translation: Encoder-decoder approaches, arXiv preprint arXiv:1409, (2014)
[4]
Chen M.X., Firat O., Bapna A., Johnson M., Macherey W., Foster G., Jones L., Schuster M., Shazeer N., Parmar N., Vaswani A., Uszkoreit J., Kaiser L., Chen Z., Wu Y., Hughes M., The best of both worlds: Combining recent advances in neural machine translation, Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pp. 76-86, (2018)
[5]
Wang C., Wu S., Liu S., Accelerating transformer decoding via a hybrid of self-attention and recurrent neural network, (2019)
[6]
Zhang L., Wang S., Liu B., Deep learning for sentiment analysis: A survey, Wiley Interdisciplinary Reviews: Data Mining and Knowledge Discovery, 8, 4, (2018)
[7]
You Q., Jin H., Wang Z., Fang C., Luo J., Image captioning with semantic attention, Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), (2016)
[8]
Mao J., Xu W., Yang Y., Wang J., Huang Z., Yuille A., Deep captioning with multimodal recurrent neural networks (m-RNN), (2014)
[9]
Streaming end-to-end speech recognition for mobile devices, In: ICASSP 2019-2019 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), pp. 6381-6385, (2019)
[10]
Vaswani A., Shazeer N., Parmar N., Uszkoreit J., Jones L., Gomez A.N., Kaiser L., Polosukhin I., Attention is all you need, Advances in Neural Information Processing Systems 30, pp. 5998-6008, (2017)