9 references in total
[1]
Clark K., Khandelwal U., Levy O., Manning C. D., 2019, What Does BERT Look At? An Analysis of BERT's Attention [J]. BlackboxNLP Workshop on Analyzing and Interpreting Neural Networks for NLP at ACL 2019, P276-286
[2]
Eiband M., 2018, Bringing Transparency Design into Practice [J]. IUI 2018: Proceedings of the 23rd International Conference on Intelligent User Interfaces, P211-223
[3]
Graves A., 2012, Long Short-Term Memory, in: Supervised Sequence Labelling with Recurrent Neural Networks, DOI 10.1007/978-3-642-24797-2_4
[4]
Honnibal M., Montani I., 2017, spaCy 2: Natural Language Understanding with Bloom Embeddings, Convolutional Neural Networks and Incremental Parsing, DOI 10.3233/978-1-60750-588-4-1080
[5]
Luong M.-T., Pham H., Manning C. D., 2015, Effective Approaches to Attention-Based Neural Machine Translation [C]. Proceedings of EMNLP 2015, DOI 10.18653/v1/D15-1166
[6]
Devlin J., Chang M.-W., 2018, Open Sourcing BERT: State-of-the-Art Pre-training for Natural Language Processing, Google AI Blog
[7]
Radford A., 2018, Improving Language Understanding by Generative Pre-Training
[8]
Sak H., Senior A., Beaufays F., 2014, Long Short-Term Memory Recurrent Neural Network Architectures for Large Scale Acoustic Modeling [C]. INTERSPEECH 2014, P338-342
[9]
Vaswani A., Shazeer N., Parmar N., et al., 2017, Attention Is All You Need [C]. Advances in Neural Information Processing Systems, V30