[Paper Reading] ELECTRA: Pre-training Text Encoders as Discriminators Rather Than Generators [ICLR 2020]
Summary:
Paper: https://arxiv.org/pdf/2003.10555.pdf
Pre-trained models and code (TensorFlow): https://github.com/google-research/electra
Note: some of the translations in the second half of this post remain rough; read with caution.
ABSTRACT: Masked language modeling (M…
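Before diving into the full note, ELECTRA's core idea of replaced token detection can be illustrated in a few lines. This is a toy sketch, not the released TensorFlow code: a small generator network would propose plausible replacements for masked tokens, and the discriminator is trained to classify every token as original or replaced. The example sentence follows the one used in the paper.

```python
# Toy illustration of ELECTRA's replaced-token-detection labels.
# In the real model, a small masked-LM "generator" produces the corrupted
# sequence; here the replacement is hard-coded for clarity.
original = ["the", "chef", "cooked", "the", "meal"]
corrupted = ["the", "chef", "ate", "the", "meal"]  # generator replaced "cooked" -> "ate"

# The discriminator predicts, for every token, whether it was replaced (1) or
# is the original token (0). Unlike masked LM, this loss covers all positions,
# not just the ~15% that were masked, which is the source of ELECTRA's
# sample efficiency.
labels = [int(o != c) for o, c in zip(original, corrupted)]
print(labels)  # [0, 0, 1, 0, 0]
```

Note that the label depends on token identity, not position: if the generator happens to sample the original token, that position is labeled 0 (original).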
posted @ 2021-08-01 17:58 Harukaze · views (734) · comments (0) · recommendations (0)