100 Classic NLP Papers of 2019 (with Links)

Link,Paper,Type,Model,Date (DD/MM/YYYY),Citations
https://arxiv.org/abs/1801.06146,Universal Language Model Fine-tuning for Text Classification,New Model ,ULMFiT,18/01/2018,525
https://arxiv.org/abs/1802.05365,Deep contextualized word representations,New Model ,ELMo,15/02/2018,2042
https://arxiv.org/abs/1810.04805,BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding,New Model ,BERT,11/10/2018,3089
https://arxiv.org/abs/1812.06705,Conditional BERT Contextual Augmentation,Building on BERT,,17/12/2018,6
https://arxiv.org/abs/1901.04085,Passage Re-ranking with BERT,Building on BERT,,13/01/2019,37
https://arxiv.org/abs/1901.05287,Assessing BERT's Syntactic Abilities,Understanding BERT,,16/01/2019,33
https://arxiv.org/abs/1901.08634,A BERT Baseline for the Natural Questions,Applying BERT,,24/01/2019,19
https://arxiv.org/abs/1901.08746,BioBERT: a pre-trained biomedical language representation model for biomedical text mining,Domain-Specific,BioBERT,25/01/2019,93
https://arxiv.org/abs/1902.02671,BERT and PALs: Projected Attention Layers for Efficient Adaptation in Multi-Task Learning,Building on BERT,,07/02/2019,9
https://arxiv.org/abs/1902.04094,"BERT has a Mouth, and It Must Speak: BERT as a Markov Random Field Language Model",Building on BERT,,11/02/2019,22
https://arxiv.org/abs/1902.10909,BERT for Joint Intent Classification and Slot Filling,Building on BERT,,28/02/2019,8
https://arxiv.org/abs/1903.06464,A Context-Aware Citation Recommendation Model with BERT and Graph Convolutional Networks,Building on BERT,,15/03/2019,3
https://arxiv.org/abs/1903.09588,Utilizing BERT for Aspect-Based Sentiment Analysis via Constructing Auxiliary Sentence,Applying BERT,,22/03/2019,15
https://arxiv.org/abs/1903.10318,Fine-tune BERT for Extractive Summarization,Building on BERT ,BERTSUM,25/03/2019,20
https://arxiv.org/abs/1903.10676,SciBERT: A Pretrained Language Model for Scientific Text,Domain-Specific,SciBERT,26/03/2019,41
https://arxiv.org/abs/1903.10972,Simple Applications of BERT for Ad Hoc Document Retrieval,Applying BERT,,26/03/2019,19
https://arxiv.org/abs/1903.12136,Distilling Task-Specific Knowledge from BERT into Simple Neural Networks,Compressing BERT,,28/03/2019,14
https://arxiv.org/abs/1904.00132,ANA at SemEval-2019 Task 3: Contextual Emotion detection in Conversations through hierarchical LSTMs and BERT,Building on BERT,,30/03/2019,6
https://arxiv.org/abs/1904.00962,Large Batch Optimization for Deep Learning: Training BERT in 76 minutes,Building on BERT,,01/04/2019,12
https://arxiv.org/abs/1904.01766,VideoBERT: A Joint Model for Video and Language Representation Learning,Building on BERT,VideoBERT,03/04/2019,27
https://arxiv.org/abs/1904.02232,BERT Post-Training for Review Reading Comprehension and Aspect-based Sentiment Analysis,Building on BERT,,03/04/2019,19
https://arxiv.org/abs/1904.03323,Publicly Available Clinical BERT Embeddings,Domain-Specific,,06/04/2019,31
https://arxiv.org/abs/1904.03450,UM-IU@LING at SemEval-2019 Task 6: Identifying Offensive Tweets Using BERT and SVMs,Applying BERT,,06/04/2019,2
https://arxiv.org/abs/1904.03339,ThisIsCompetition at SemEval-2019 Task 9: BERT is unstable for out-of-domain samples,Building on BERT,JESSI,06/04/2019,1
https://arxiv.org/abs/1904.05342,ClinicalBERT: Modeling Clinical Notes and Predicting Hospital Readmission,Domain-Specific,ClinicalBERT,10/04/2019,5
https://arxiv.org/abs/1904.05255,Simple BERT Models for Relation Extraction and Semantic Role Labeling,Building on BERT,,10/04/2019,5
https://arxiv.org/abs/1904.06652,Data Augmentation for BERT Fine-Tuning in Open-Domain Question Answering,Building on BERT,,14/04/2019,2
https://arxiv.org/abs/1904.07531,Understanding the Behaviors of BERT in Ranking,Understanding BERT,,16/04/2019,10
https://arxiv.org/abs/1904.08398,DocBERT: BERT for Document Classification,Applying BERT,DocBERT,17/04/2019,14
https://arxiv.org/abs/1904.09077,"Beto, Bentz, Becas: The Surprising Cross-Lingual Effectiveness of BERT",Multilingual BERT,,19/04/2019,20
https://arxiv.org/abs/1904.09675,BERTScore: Evaluating Text Generation with BERT,Applying BERT,,21/04/2019,15
https://arxiv.org/abs/1905.01758,Investigating the Successes and Failures of BERT for Passage Re-Ranking,Understanding BERT,,05/05/2019,8
https://arxiv.org/abs/1905.01780,Anonymized BERT: An Augmentation Approach to the Gendered Pronoun Resolution Challenge,Applying BERT,,06/05/2019,2
https://arxiv.org/abs/1905.02331,X-BERT: eXtreme Multi-label Text Classification with using Bidirectional Encoder Representations from Transformers,New Model ,X-BERT,07/05/2019,1
https://arxiv.org/abs/1905.05615,Transfer Learning for Scientific Data Chain Extraction in Small Chemical Corpus with BERT-CRF Model,Building on BERT,BERT-CRF,13/05/2019,2
https://arxiv.org/abs/1906.02124,PatentBERT: Patent Classification with Fine-Tuning a pre-trained BERT Model,Domain-Specific,PatentBERT,14/05/2019,3
https://arxiv.org/abs/1905.05412,BERT with History Answer Embedding for Conversational Question Answering,Applying BERT,,14/05/2019,3
https://arxiv.org/abs/1905.05583,How to Fine-Tune BERT for Text Classification?,Understanding BERT,,14/05/2019,11
https://arxiv.org/abs/1905.05950,BERT Rediscovers the Classical NLP Pipeline,Understanding BERT,,15/05/2019,34
https://arxiv.org/abs/1905.06638,Latent Universal Task-Specific BERT,Building on BERT,,16/05/2019,0
https://arxiv.org/abs/1905.07129,ERNIE: Enhanced Language Representation with Informative Entities,New Model,ERNIE ,17/05/2019,26
https://arxiv.org/abs/1905.07504,Story Ending Prediction by Transferable BERT,Applying BERT,TransBERT,17/05/2019,0
https://arxiv.org/abs/1905.07830,HellaSwag: Can a Machine Really Finish Your Sentence?,Understanding BERT,HellaSwag,19/05/2019,14
https://arxiv.org/abs/1905.10650,Are Sixteen Heads Really Better than One?,Understanding BERT,,25/05/2019,18
https://arxiv.org/abs/1905.13068,Unbabel's Submission to the WMT2019 APE Shared Task: BERT-based Encoder-Decoder for Automatic Post-Editing,Building on BERT,BED,30/05/2019,1
https://arxiv.org/abs/1905.12848,A Simple but Effective Method to Incorporate Multi-turn Context with BERT for Conversational Machine Comprehension,Building on BERT,,30/05/2019,1
https://arxiv.org/abs/1905.13497,Attention Is (not) All You Need for Commonsense Reasoning,Understanding BERT,,31/05/2019,1
https://arxiv.org/abs/1906.01161,Resolving Gendered Ambiguous Pronouns with BERT,Applying BERT,,03/06/2019,0
https://arxiv.org/abs/1906.01502,How multilingual is Multilingual BERT?,Multilingual BERT,,04/06/2019,16
https://arxiv.org/abs/1906.01698,Open Sesame: Getting Inside BERT's Linguistic Knowledge,Understanding BERT,,04/06/2019,7
https://arxiv.org/abs/1906.02715,Visualizing and Measuring the Geometry of BERT,Understanding BERT,,06/06/2019,11
https://arxiv.org/abs/1906.04165,Leveraging BERT for Extractive Text Summarization on Lectures,Applying BERT,,07/06/2019,2
https://arxiv.org/abs/1906.03695,Gendered Pronoun Resolution using BERT and an extractive question answering formulation,Applying BERT,,09/06/2019,1
https://arxiv.org/abs/1906.04341,What Does BERT Look At? An Analysis of BERT's Attention,Understanding BERT,,11/06/2019,28
https://arxiv.org/abs/1906.05474,Transfer Learning in Biomedical Natural Language Processing: An Evaluation of BERT and ELMo on Ten Benchmarking Datasets,Domain-Specific,,13/06/2019,9
https://arxiv.org/abs/1906.08101,Pre-Training with Whole Word Masking for Chinese BERT,Multilingual BERT,,19/06/2019,10
https://arxiv.org/abs/1906.08237,XLNet: Generalized Autoregressive Pretraining for Language Understanding,New Model ,XLNet,19/06/2019,221
https://arxiv.org/abs/1906.11565,EmotionX-KU: BERT-Max based Contextual Emotion Classifier,Building on BERT ,,27/06/2019,1
https://arxiv.org/abs/1906.11511,Inducing Syntactic Trees from BERT Representations,Understanding BERT,,27/06/2019,0
https://arxiv.org/abs/1907.03040,BERT-DST: Scalable End-to-End Dialogue State Tracking with Bidirectional Encoder Representations from Transformer,New Model ,BERT-DST,05/07/2019,4
https://arxiv.org/abs/1907.02884,Multi-lingual Intent Detection and Slot Filling in a Joint BERT-based Model,Building on BERT ,Bert-Joint,05/07/2019,1
https://arxiv.org/abs/1907.06226,A Simple BERT-Based Approach for Lexical Simplification,Building on BERT ,,15/07/2019,0
https://arxiv.org/abs/1907.09669,EmotionX-HSU: Adopting Pre-trained BERT for Emotion Classification,Building on BERT ,,23/07/2019,1
https://arxiv.org/abs/1907.10529,SpanBERT: Improving Pre-training by Representing and Predicting Spans,Building on BERT,SpanBERT,24/07/2019,22
https://arxiv.org/abs/1907.11692,RoBERTa: A Robustly Optimized BERT Pretraining Approach,New Model,RoBERTa,26/07/2019,60
https://arxiv.org/abs/1907.11932,Is BERT Really Robust? A Strong Baseline for Natural Language Attack on Text Classification and Entailment,Understanding BERT,TextFooler,27/07/2019,3
https://arxiv.org/abs/1907.12679,Machine Translation Evaluation with BERT Regressor,Building on BERT ,,29/07/2019,0
https://arxiv.org/abs/1907.12412,ERNIE 2.0: A Continual Pre-training Framework for Language Understanding,New Model ,Ernie 2.0,29/07/2019,6
https://arxiv.org/abs/1907.13528,What BERT is not: Lessons from a new suite of psycholinguistic diagnostics for language models,Understanding BERT,,31/07/2019,2
https://arxiv.org/abs/1908.00308,MSnet: A BERT-based Network for Gendered Pronoun Resolution,Building on BERT ,,01/08/2019,1
https://arxiv.org/abs/1908.01767,Exploring Neural Net Augmentation to BERT for Question Answering on SQUAD 2.0,Building on BERT ,,04/08/2019,0
https://arxiv.org/abs/1908.02451,TinySearch -- Semantics based Search Engine using Bert Embeddings,Building on BERT ,,07/08/2019,0
https://arxiv.org/abs/1908.03548,BERT-based Ranking for Biomedical Entity Normalization,Domain-Specific,,09/08/2019,1
https://arxiv.org/abs/1908.04812,Domain Adaptive Training BERT for Response Selection,Building on BERT ,,13/08/2019,0
https://arxiv.org/abs/1908.04577,StructBERT: Incorporating Language Structures into Pre-training for Deep Language Understanding,Building on BERT,StructBERT,13/08/2019,1
https://arxiv.org/abs/1908.04943,"Establishing Strong Baselines for the New Decade: Sequence Tagging, Syntactic and Semantic Parsing with BERT",Building on BERT ,,14/08/2019,0
https://arxiv.org/abs/1908.05672,Towards Making the Most of BERT in Neural Machine Translation,Building on BERT ,,15/08/2019,4
https://arxiv.org/abs/1908.05787,M-BERT: Injecting Multimodal Information in the BERT Structure,Building on BERT ,M-BERT,15/08/2019,2
https://arxiv.org/abs/1908.05646,SenseBERT: Driving Some Sense into BERT,New Model ,SenseBERT,15/08/2019,1
https://arxiv.org/abs/1908.05620,Visualizing and Understanding the Effectiveness of BERT,Understanding BERT,,15/08/2019,1
https://arxiv.org/abs/1908.05908,BERT-Based Multi-Head Selection for Joint Entity-Relation Extraction,Building on BERT ,,16/08/2019,0
https://arxiv.org/abs/1908.06264,EmotionX-IDEA: Emotion BERT -- an Affectional Model for Conversation,Building on BERT ,,17/08/2019,1
https://arxiv.org/abs/1908.06780,A Study of BERT for Non-Factoid Question-Answering under Passage Length Constraints,Building on BERT ,,19/08/2019,0
https://arxiv.org/abs/1908.06926,Neural Architectures for Nested NER through Linearization,Applying BERT,,19/08/2019,4
https://arxiv.org/abs/1908.07245,GlossBERT: BERT for Word Sense Disambiguation with Gloss Knowledge,Building on BERT ,,20/08/2019,0
https://arxiv.org/abs/1908.07721,Fine-tuning BERT for Joint Entity and Relation Extraction in Chinese Medical Text,Domain-Specific,,21/08/2019,0
https://arxiv.org/abs/1908.08593,Revealing the Dark Secrets of BERT,Understanding BERT,,21/08/2019,3
https://arxiv.org/abs/1908.08530,VL-BERT: Pre-training of Generic Visual-Linguistic Representations,New Model ,VL-BERT,22/08/2019,15
https://arxiv.org/abs/1908.09091,BERT for Coreference Resolution: Baselines and Analysis,Applying BERT,,24/08/2019,1
https://arxiv.org/abs/1908.09355,Patient Knowledge Distillation for BERT Model Compression,Compressing BERT,,25/08/2019,4
https://arxiv.org/abs/1908.09892,Does BERT agree? Evaluating knowledge of structure dependence through agreement relations,Understanding BERT,,26/08/2019,1
https://arxiv.org/abs/1908.10084,Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks,Building on BERT ,Sentence-BERT,27/08/2019,4
https://arxiv.org/abs/1908.11860,Adapt or Get Left Behind: Domain Adaptation through BERT Language Model Finetuning for Aspect-Target Sentiment Classification,Building on BERT ,,30/08/2019,2
https://arxiv.org/abs/1909.00109,Giving BERT a Calculator: Finding Operations and Arguments with Reading Comprehension,Building on BERT ,,31/08/2019,1
https://arxiv.org/abs/1909.00100,Small and Practical BERT Models for Sequence Labeling,Compressing BERT,,31/08/2019,3
https://arxiv.org/abs/1909.00578,SumQE: a BERT-based Summary Quality Estimation Model,Building on BERT ,SumQE,02/09/2019,1
https://arxiv.org/abs/1909.00512,"How Contextual are Contextualized Word Representations? Comparing the Geometry of BERT, ELMo, and GPT-2 Embeddings",Understanding BERT,,02/09/2019,0
https://arxiv.org/abs/1909.00931,Transfer Fine-Tuning: A BERT Case Study,Building on BERT ,,03/09/2019,0
https://arxiv.org/abs/1909.02209,Semantics-aware BERT for Language Understanding,Building on BERT ,SemBERT,05/09/2019,3
https://arxiv.org/abs/1909.02597,Investigating BERT's Knowledge of Language: Five Analysis Methods with NPIs,Understanding BERT,,05/09/2019,2
https://arxiv.org/abs/1909.03193,KG-BERT: BERT for Knowledge Graph Completion,Building on BERT ,KG-BERT,07/09/2019,0
https://arxiv.org/abs/1909.03223,Deleter: Leveraging BERT to Perform Unsupervised Successive Text Compression,Applying BERT,,07/09/2019,0
https://arxiv.org/abs/1909.03415,Commonsense Knowledge + BERT for Level 2 Reading Comprehension Ability Test,Building on BERT ,,08/09/2019,0
https://arxiv.org/abs/1909.03405,Symmetric Regularization based BERT for Pair-wise Semantic Reasoning,Building on BERT ,,08/09/2019,0
https://arxiv.org/abs/1909.04181,BERT-Based Arabic Social Media Author Profiling,Applying BERT,,09/09/2019,1
https://arxiv.org/abs/1909.04925,How Does BERT Answer Questions? A Layer-Wise Analysis of Transformer Representations,Understanding BERT,,11/09/2019,0
https://arxiv.org/abs/1909.05840,Q-BERT: Hessian Based Ultra Low Precision Quantization of BERT,Compressing BERT,Q-BERT,12/09/2019,6
https://arxiv.org/abs/1909.06775,Cross-Lingual BERT Transformation for Zero-Shot Dependency Parsing,Building on BERT ,CLBT,15/09/2019,3
https://arxiv.org/abs/1909.07606,K-BERT: Enabling Language Representation with Knowledge Graph,Building on BERT ,K-BERT,17/09/2019,1
https://arxiv.org/abs/1909.08402,Enriching BERT with Knowledge Graph Embeddings for Document Classification,Building on BERT ,,18/09/2019,0
https://arxiv.org/abs/1909.08358,Using BERT for Word Sense Disambiguation,Applying BERT,,18/09/2019,1
https://arxiv.org/abs/1909.09292,BERT Meets Chinese Word Segmentation,Applying BERT,,19/09/2019,1
https://arxiv.org/abs/1908.08167,Multi-passage BERT: A Globally Normalized BERT Model for Open-domain Question Answering,Building on BERT ,,22/09/2019,3
https://arxiv.org/abs/1909.10351,TinyBERT: Distilling BERT for Natural Language Understanding,Compressing BERT,TinyBERT,23/09/2019,5
https://arxiv.org/abs/1909.10430,Does BERT Make Any Sense? Interpretable Word Sense Disambiguation with Contextualized Embeddings,Understanding BERT,,23/09/2019,0
https://arxiv.org/abs/1909.10649,Portuguese Named Entity Recognition using BERT-CRF,Applying BERT,,23/09/2019,0
https://arxiv.org/abs/1909.11764,FreeLB: Enhanced Adversarial Training for Language Understanding,Building on BERT,FreeLB,25/09/2019,4
https://openreview.net/forum?id=SJxjVaNKwB,MobileBERT: Task-Agnostic Compression of BERT by Progressive Knowledge Transfer ,Compressing BERT,MobileBERT,25/09/2019,0
https://arxiv.org/abs/1909.11898,Fine-tune Bert for DocRED with Two-step Process,Applying BERT,,26/09/2019,0
https://arxiv.org/abs/1909.12744,On the use of BERT for Neural Machine Translation,Building on BERT ,,27/09/2019,0
https://arxiv.org/abs/1909.11942,ALBERT: A Lite BERT for Self-supervised Learning of Language Representations,New Model ,ALBERT,28/09/2019,19
https://arxiv.org/abs/1910.03089,End-to-End Resume Parsing and Finding Candidates for a Job Description using BERT,Applying BERT,,30/09/2019,0
https://arxiv.org/abs/1910.01157,Cracking the Contextual Commonsense Code: Understanding Commonsense Reasoning Aptitude of Deep Contextual Representations,Understanding BERT,,02/10/2019,0
https://arxiv.org/abs/1910.01108,"DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter",Compressing BERT,DistilBERT,02/10/2019,11
https://arxiv.org/abs/1910.00883,Exploiting BERT for End-to-End Aspect-based Sentiment Analysis,Applying BERT,,02/10/2019,0
https://arxiv.org/abs/1910.03474,Fine-grained Sentiment Classification using BERT,Applying BERT,,04/10/2019,0
https://arxiv.org/abs/1910.02655,BERT for Evidence Retrieval and Claim Verification,Applying BERT,,07/10/2019,0
https://arxiv.org/abs/1910.03806,Is Multilingual BERT Fluent in Language Generation?,Multilingual BERT,,09/10/2019,2
https://arxiv.org/abs/1910.05786,Progress Notes Classification and Keyword Extraction using Attention-based Deep Learning Models with BERT,Applying BERT,,13/10/2019,0
https://arxiv.org/abs/1910.06188,Q8BERT: Quantized 8Bit BERT,Compressing BERT,,14/10/2019,2
https://arxiv.org/abs/1910.06431,Whatcha lookin' at? DeepLIFTing BERT's Attention in Question Answering,Understanding BERT,DeepLIFT,14/10/2019,0
https://arxiv.org/abs/1910.06360,Pruning a BERT-based Question Answering Model,Compressing BERT,,14/10/2019,0
https://arxiv.org/abs/1910.07179,Content Enhanced BERT-based Text-to-SQL Generation,Applying BERT ,,16/10/2019,0
https://arxiv.org/abs/1910.07973,Universal Text Representation from BERT: An Empirical Study,Understanding BERT,,17/10/2019,0
https://arxiv.org/abs/1910.12647,HUBERT Untangles BERT to Improve Transfer across NLP Tasks,New Model ,HUBERT,25/10/2019,0
https://arxiv.org/abs/1910.12366,Thieves on Sesame Street! Model Extraction of BERT-based APIs,Understanding BERT,,27/10/2019,0
https://arxiv.org/abs/1910.12574,A BERT-Based Transfer Learning Approach for Hate Speech Detection in Online Social Media,Applying BERT,,28/10/2019,0
https://arxiv.org/abs/1910.12391,What does BERT Learn from Multiple-Choice Reading Comprehension Datasets?,Understanding BERT,,28/10/2019,0
https://arxiv.org/abs/1910.12995,A Simple but Effective BERT Model for Dialog State Tracking on Resource-Limited Systems,Compressing BERT,,29/10/2019,0
https://arxiv.org/abs/1910.14549,Positional Attention-based Frame Identification with BERT: A Deep Learning Approach to Target Disambiguation and Semantic Frame Selection,Building on BERT ,PAFIBERT,31/10/2019,0
https://arxiv.org/abs/1910.14424,Multi-Stage Document Ranking with BERT,Building on BERT ,monoBERT and duoBERT,31/10/2019,1
https://arxiv.org/abs/1910.14296,LIMIT-BERT : Linguistic Informed Multi-Task BERT,Building on BERT ,LIMIT-BERT,31/10/2019,1
https://arxiv.org/abs/1910.14243,DiaNet: BERT and Hierarchical Attention Multi-Task Learning of Fine-Grained Dialect,Applying BERT,,31/10/2019,0
https://arxiv.org/abs/1911.00473,BERT Goes to Law School: Quantifying the Competitive Advantage of Access to Large Legal Corpora in Contract Understanding,Domain-Specific,,01/11/2019,0
https://arxiv.org/abs/1911.00637,Sentence-Level BERT and Multi-Task Learning of Age and Gender in Social Media,Applying BERT,,02/11/2019,0
https://arxiv.org/abs/1811.01088,Sentence Encoders on STILTs: Supplementary Training on Intermediate Labeled-data Tasks,Building on BERT ,BERT on STILTs,02/11/2019,24
https://arxiv.org/abs/1911.06241,BERT-CNN: a Hierarchical Patent Classifier Based on a Pre-Trained Language Model,Domain-Specific,BERT-CNN,03/11/2019,0
https://arxiv.org/abs/1911.01528,BAS: An Answer Selection Method Using BERT Language Model,Applying BERT,,04/11/2019,0
https://arxiv.org/abs/1911.01940,Deepening Hidden Representations from Pre-trained Language Models for Natural Language Understanding,Building on BERT,HIRE-RoBERTa,05/11/2019,0
https://arxiv.org/abs/1911.02365,Learning to Answer by Learning to Ask: Getting the Best of GPT-2 and BERT Worlds,Building on BERT ,,06/11/2019,0
https://arxiv.org/abs/1911.02969,BERTs of a feather do not generalize together: Large variability in generalization across models with similar test set performance,Understanding BERT,,07/11/2019,1
https://arxiv.org/abs/1911.03310,How Language-Neutral is Multilingual BERT?,Multilingual BERT,,08/11/2019,0
https://arxiv.org/abs/1911.03437,SMART: Robust and Efficient Fine-Tuning for Pre-trained Natural Language Models through Principled Regularized Optimization,New Model ,SMART,08/11/2019,0
https://arxiv.org/abs/1911.03681,BERT is Not a Knowledge Base (Yet): Factual Knowledge vs. Name-Based Reasoning in Unsupervised QA,Building on BERT ,E-BERT,09/11/2019,0
https://arxiv.org/abs/1911.03918,Improving BERT Fine-tuning with Embedding Normalization,Understanding BERT,,10/11/2019,0
https://arxiv.org/abs/1911.06156,Syntax-Infused Transformer and BERT models for Machine Translation and Natural Language Understanding,Building on BERT ,,10/11/2019,0
https://arxiv.org/abs/1911.03829,Distilling the Knowledge of BERT for Text Generation,Building on BERT,,10/11/2019,0
https://arxiv.org/abs/1911.04525,Understanding BERT performance in propaganda analysis,Applying BERT,,11/11/2019,0
https://arxiv.org/abs/1911.05758,"What do you mean, BERT? Assessing BERT as a Distributional Semantics Model",Understanding BERT,,13/11/2019,0
https://arxiv.org/abs/1912.01389,Towards Lingua Franca Named Entity Recognition with BERT,Multilingual BERT,,19/11/2019,0
https://arxiv.org/abs/1911.12246,Do Attention Heads in BERT Track Syntactic Dependencies?,Understanding BERT,,27/11/2019,0
https://arxiv.org/abs/1911.12753,Inducing Relational Knowledge from BERT,Understanding BERT,,28/11/2019,0
https://arxiv.org/abs/1912.05308,Unsupervised Transfer Learning via BERT Neuron Selection,Understanding BERT,,10/12/2019,0
https://arxiv.org/abs/1912.05238,BERT has a Moral Compass: Improvements of ethical and moral values of machines,Understanding BERT,,11/12/2019,0
https://arxiv.org/abs/1912.07076,Multilingual is not enough: BERT for Finnish,Multilingual BERT,,15/12/2019,0
https://arxiv.org/abs/1912.07840,Cross-Lingual Ability of Multilingual BERT: An Empirical Study,Multilingual BERT,,17/12/2019,0
https://arxiv.org/abs/1912.09582,BERTje: A Dutch BERT Model,Multilingual BERT,BERTje,19/12/2019,0
https://nlp.stanford.edu/pubs/hewitt2019structural.pdf,A Structural Probe for Finding Syntax in Word Representations,Understanding BERT,,06/2019,40
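The list above is plain CSV with the same columns as the header line, so it can be loaded and queried directly. Below is a minimal sketch, assuming the table has been saved verbatim (header included) to a local file; the filename `bert_papers_2019.csv` is a placeholder rather than part of this post, and pandas is used only because it handles the quoted titles that contain commas.

```python
# Minimal sketch: load the paper list above and run a couple of example queries.
# Assumes the CSV has been saved locally as "bert_papers_2019.csv" (placeholder name).
import pandas as pd

# Quoted titles such as "BERT has a Mouth, and It Must Speak: ..." contain commas,
# so a real CSV parser is needed instead of a naive split on ",".
df = pd.read_csv("bert_papers_2019.csv")

# Dates in the table are day-first (DD/MM/YYYY); entries that only give a month
# (e.g. "06/2019") are coerced to NaT rather than raising an error.
df["Date"] = pd.to_datetime(df["Date"], format="%d/%m/%Y", errors="coerce")

# Example queries: the ten most-cited entries, and all "Compressing BERT" papers.
# Some Type values carry trailing spaces, hence the strip before comparing.
top_cited = df.sort_values("Citations", ascending=False).head(10)
compression = df[df["Type"].str.strip() == "Compressing BERT"]

print(top_cited[["Paper", "Citations"]])
print(compression[["Paper", "Model", "Date"]])
```

The same approach works for any of the other Type values in the table (Understanding BERT, Multilingual BERT, Domain-Specific, and so on); only the string in the filter changes.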