T5 and Large Language Models

T5

Text-to-Text Transfer Transformer.

Treat every NLP problem as a TEXT-TO-TEXT TASK.

Universal format: task prefix + input sentence -> answer text
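A minimal sketch of this universal format, assuming the Hugging Face transformers library and the public t5-small checkpoint; the task prefixes follow the conventions used in the T5 paper.

```python
# Sketch: the same model handles different tasks by changing only the text prefix.
from transformers import AutoTokenizer, T5ForConditionalGeneration

tokenizer = AutoTokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")

prompts = [
    "translate English to German: The house is wonderful.",   # translation
    "cola sentence: The course is jumping well.",              # grammatical acceptability
    "summarize: state authorities dispatched emergency crews tuesday ...",  # summarization
]

for text in prompts:
    inputs = tokenizer(text, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=32)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```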

Details

Pretraining: an encoder-decoder Transformer whose encoder and decoder are each roughly the size of BERT-base, trained with a span-corruption denoising objective on the C4 dataset (Colossal Clean Crawled Corpus).
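A toy sketch of the span-corruption denoising objective (a simplification of the real procedure, which corrupts about 15% of tokens with a mean span length of 3): contiguous spans are replaced by sentinel tokens in the input, and the decoder reconstructs each masked span after its sentinel.

```python
import random

def span_corrupt(tokens, corruption_rate=0.15, mean_span_len=3):
    """Replace random contiguous spans with sentinel tokens (toy version)."""
    n_to_mask = max(1, int(len(tokens) * corruption_rate))
    inputs, targets, sentinel, i = [], [], 0, 0
    while i < len(tokens):
        if n_to_mask > 0 and random.random() < corruption_rate:
            span = min(mean_span_len, n_to_mask, len(tokens) - i)
            targets += [f"<extra_id_{sentinel}>"] + tokens[i:i + span]
            inputs.append(f"<extra_id_{sentinel}>")
            sentinel += 1
            n_to_mask -= span
            i += span
        else:
            inputs.append(tokens[i])
            i += 1
    return " ".join(inputs), " ".join(targets)

tokens = "Thank you for inviting me to your party last week".split()
print(span_corrupt(tokens))
# e.g. input:  "Thank you <extra_id_0> me to your party <extra_id_1> week"
#      target: "<extra_id_0> for inviting <extra_id_1> last"
```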

Fine-tuning: GLUE, CNN/Daily Mail abstractive summarization, and SQuAD.
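Each fine-tuning task is cast into the same text-to-text format; the prefixes below approximate the T5 paper's conventions, and the example texts are placeholders.

```python
# Sketch: how downstream tasks are expressed as (input text, target text) pairs.
glue_example = {
    "input":  "mnli hypothesis: A person is outside. premise: A person is outdoors.",
    "target": "entailment",
}
summarization_example = {
    "input":  "summarize: <CNN/Daily Mail article text>",
    "target": "<reference summary>",
}
squad_example = {
    "input":  "question: <question text> context: <passage text>",
    "target": "<answer span as plain text>",
}
```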

mT5

A multilingual variant of T5, pretrained on the mC4 corpus covering 101 languages.

Closed-Book QA

T5.1.1 is pretrained only on unsupervised data, so the knowledge needed to answer questions must be stored in its parameters.

Use salient span masking (SSM): mask named entities and dates rather than random spans, which pushes the model to memorize factual knowledge.
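A minimal sketch of salient span masking, assuming a hypothetical upstream tagger that supplies (start, end) token indices of entity/date spans (in practice a NER or date tagger is used):

```python
def salient_span_mask(tokens, salient_spans):
    """Mask whole entity/date spans with sentinels instead of random spans."""
    inputs, targets, sentinel, i = [], [], 0, 0
    span_start = {start: end for start, end in salient_spans}
    while i < len(tokens):
        if i in span_start:
            end = span_start[i]
            targets += [f"<extra_id_{sentinel}>"] + tokens[i:end]
            inputs.append(f"<extra_id_{sentinel}>")
            sentinel += 1
            i = end
        else:
            inputs.append(tokens[i])
            i += 1
    return " ".join(inputs), " ".join(targets)

tokens = "Franklin D. Roosevelt was born in January 1882".split()
# Suppose the tagger marks "Franklin D. Roosevelt" and "January 1882":
print(salient_span_mask(tokens, [(0, 3), (6, 8)]))
# input:  "<extra_id_0> was born in <extra_id_1>"
# target: "<extra_id_0> Franklin D. Roosevelt <extra_id_1> January 1882"
```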
