Current tag: transformer

[notes] notes of Transformer [6]
LAUSpectrum 2019-09-28 20:24 · Reads: 201 · Comments: 0 · Recommended: 0
[notes] notes of Transformer [5]
LAUSpectrum 2019-09-28 20:14 · Reads: 169 · Comments: 0 · Recommended: 0
[paper] Multi-Task Learning for Abstractive and Extractive Summarization
LAUSpectrum 2019-09-17 21:11 · Reads: 249 · Comments: 0 · Recommended: 0
[paper] From Neural Sentence Summarization to Headline Generation: A Coarse-to-Fine Approach
LAUSpectrum 2019-09-16 18:22 · Reads: 268 · Comments: 0 · Recommended: 0
[paper] Scoring Sentence Singletons and Pairs for Abstractive Summarization
LAUSpectrum 2019-09-16 16:04 · Reads: 542 · Comments: 0 · Recommended: 0
[paper] Ontology-Aware Clinical Abstractive Summarization
LAUSpectrum 2019-09-15 22:18 · Reads: 204 · Comments: 0 · Recommended: 0
[paper] HIBERT: Document Level Pre-training of Hierarchical Bidirectional Transformers for Document Summarization
LAUSpectrum 2019-09-15 21:31 · Reads: 682 · Comments: 0 · Recommended: 0
[paper] On Extractive and Abstractive Neural Document Summarization with Transformer Language Models
LAUSpectrum 2019-09-14 22:42 · Reads: 879 · Comments: 1 · Recommended: 0
[paper] Hierarchical Transformers for Multi-Document Summarization
LAUSpectrum 2019-09-14 21:14 · Reads: 762 · Comments: 0 · Recommended: 0
[notes] notes of Transformer [4]
LAUSpectrum 2019-07-25 21:25 · Reads: 177 · Comments: 0 · Recommended: 0
[notes] notes of Transformer [3]
LAUSpectrum 2019-07-25 10:02 · Reads: 181 · Comments: 1 · Recommended: 0
[paper] Pay Less Attention with Lightweight and Dynamic Convolutions
LAUSpectrum 2019-07-21 15:15 · Reads: 740 · Comments: 0 · Recommended: 0
[report] Transformers and Pointer-Generator Networks for Abstractive Summarization
LAUSpectrum 2019-07-16 23:12 · Reads: 1252 · Comments: 8 · Recommended: 1
[report] Just News It: Abstractive Text Summarization with a Pointer-Generator Transformer
LAUSpectrum 2019-07-16 21:24 · Reads: 471 · Comments: 1 · Recommended: 1
[report] Faster Transformers for Text Summarization
LAUSpectrum 2019-07-16 17:32 · Reads: 306 · Comments: 0 · Recommended: 0
[report] Faster Transformers for Document Summarization
LAUSpectrum 2019-07-16 12:36 · Reads: 186 · Comments: 0 · Recommended: 0
[notes] notes of Transformer [2]
LAUSpectrum 2019-07-15 21:40 · Reads: 165 · Comments: 0 · Recommended: 0
[paper] Improving Language Understanding by Generative Pre-Training
LAUSpectrum 2019-07-12 12:01 · Reads: 1296 · Comments: 0 · Recommended: 0
[paper] Self-Attention with Relative Position Representations
LAUSpectrum 2019-07-11 21:21 · Reads: 664 · Comments: 0 · Recommended: 0
[paper] Efficient Adaptation of Pretrained Transformers for Abstractive Summarization
LAUSpectrum 2019-07-11 13:03 · Reads: 396 · Comments: 0 · Recommended: 0