T5 and Large Language Models
T5
Text-to-Text Transfer Transformer.
Treats every NLP problem as a text-to-text task.
Universal format: task description + input sentence -> answer (sketched below).
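A minimal sketch of the text-to-text format, assuming the Hugging Face transformers library and the public t5-small checkpoint (not part of the original notes); the task prefixes are the ones used in the T5 paper:

```python
# Text-to-text format: the task description is prepended to the input sentence
# and the answer is generated as plain text.  Sketch assumes the Hugging Face
# `transformers` library and the public "t5-small" checkpoint.
from transformers import T5Tokenizer, T5ForConditionalGeneration

tokenizer = T5Tokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")

examples = [
    "translate English to German: The house is wonderful.",   # translation
    "cola sentence: The course is jumping well.",              # GLUE CoLA acceptability
    "summarize: state authorities dispatched emergency crews tuesday ...",  # summarization
]

for text in examples:
    input_ids = tokenizer(text, return_tensors="pt").input_ids
    output_ids = model.generate(input_ids, max_new_tokens=32)
    print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```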
Details
Pretrain: a BERT-Base-sized encoder-decoder Transformer, trained with a span-corruption denoising objective (sketched below) on the C4 dataset.
Finetune: GLUE, CNN/Daily Mail abstractive summarization, SQuAD.
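The denoising objective replaces contiguous spans of the input with sentinel tokens and trains the decoder to reproduce the dropped spans. A hand-rolled sketch of the format (span positions are chosen by hand here; the paper samples them randomly with a 15% corruption rate and mean span length 3):

```python
# Span-corruption ("denoising") format: masked spans become sentinel tokens in
# the input, and the target lists each sentinel followed by the dropped text.

def span_corrupt(tokens, spans):
    """spans: list of (start, length) pairs, non-overlapping and sorted."""
    inputs, targets = [], []
    i, sid = 0, 0
    for start, length in spans:
        inputs.extend(tokens[i:start])          # keep text before the span
        inputs.append(f"<extra_id_{sid}>")      # sentinel replaces the span
        targets.append(f"<extra_id_{sid}>")
        targets.extend(tokens[start:start + length])
        i, sid = start + length, sid + 1
    inputs.extend(tokens[i:])
    targets.append(f"<extra_id_{sid}>")         # final sentinel ends the target
    return " ".join(inputs), " ".join(targets)

src = "Thank you for inviting me to your party last week".split()
inp, tgt = span_corrupt(src, spans=[(2, 2), (8, 1)])
print(inp)  # Thank you <extra_id_0> me to your party <extra_id_1> week
print(tgt)  # <extra_id_0> for inviting <extra_id_1> last <extra_id_2>
```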
mT5
Multilingual T5, pretrained on the mC4 corpus covering 101 languages.
Closed-Book QA
T5.1.1 is pretrained only on unsupervised data, so all the knowledge it uses to answer questions must come from pretraining.
Uses salient span masking (SSM), which masks named entities and dates, as an additional pretraining step (sketched below).
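A minimal illustration of salient span masking; the entity offsets are given by hand here, whereas the actual recipe relies on an automatic named-entity/date tagger:

```python
# Salient span masking: instead of random spans, the masked spans are named
# entities or dates, so the model must store factual knowledge to fill them in.

def salient_span_mask(text, entity_spans):
    """entity_spans: list of (start, end) character offsets of salient spans."""
    out, targets, prev, sid = [], [], 0, 0
    for start, end in entity_spans:
        out.append(text[prev:start])
        out.append(f"<extra_id_{sid}>")
        targets.append(f"<extra_id_{sid}> {text[start:end]}")
        prev, sid = end, sid + 1
    out.append(text[prev:])
    return "".join(out), " ".join(targets) + f" <extra_id_{sid}>"

sentence = "Franklin D. Roosevelt was born in January 1882."
# salient spans: the person entity and the date
masked, target = salient_span_mask(sentence, [(0, 21), (34, 46)])
print(masked)  # <extra_id_0> was born in <extra_id_1>.
print(target)  # <extra_id_0> Franklin D. Roosevelt <extra_id_1> January 1882 <extra_id_2>
```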
