[1]
Chung J., Gulcehre C., Cho K., Bengio Y., Empirical evaluation of gated recurrent neural networks on sequence modeling, (2014)
[2]
Kowsari K., Heidarysafa M., Brown D. E., Meimandi K. J., Barnes L. E., RMDL: Random multimodel deep learning for classification, Proc. of the 2nd Int. Conf. on Information System and Data Mining, pp. 19-28, (2018)
[3]
Robertson S., Understanding inverse document frequency: On theoretical arguments for IDF, Journal of Documentation, 60, 5, pp. 503-520, (2004)
[4]
Wang B., Klabjan D., Regularization for unsupervised deep neural nets, Thirty-First AAAI Conf. on Artificial Intelligence, pp. 2681-2687, (2017)
[5]
Kubo Y., Tucker G., Wiesler S., Compacting neural network classifiers via dropout training, (2016)
[6]
Gal Y., Ghahramani Z., A theoretically grounded application of dropout in recurrent neural networks, Proc. of the Advances in Neural Information Processing Systems, pp. 1019-1027, (2016)
[7]
Srivastava N., Hinton G., Krizhevsky A., Sutskever I., Salakhutdinov R., Dropout: A simple way to prevent neural networks from overfitting, The Journal of Machine Learning Research, 15, 1, pp. 1929-1958, (2014)
[8]
Ioffe S., Szegedy C., Batch normalization: Accelerating deep network training by reducing internal covariate shift, Int. Conf. on Machine Learning, PMLR, Lille, France, pp. 448-456, (2015)
[9]
Xiang S., Li H., On the effects of batch and weight normalization in generative adversarial networks, (2017)
[10]
Simard P. Y., Steinkraus D., Platt J. C., Best practices for convolutional neural networks applied to visual document analysis, Proc. of the Seventh Int. Conf. on Document Analysis and Recognition, pp. 958-962, (2003)