Harukaze


July 29, 2021

[Paper Reading] BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding [arXiv 2019]

Abstract: Paper: https://arxiv.org/abs/1810.04805 · Code (TensorFlow): https://github.com/google-research/bert · Transformer explained: http://nlp.seas.harvard.edu/2018/04/03/at

