Summary: import torch; from transformers import BertTokenizer, BertModel, BertForMaskedLM; from transformers import logging; logging.set_verbosity_error() ...
posted @ 2022-04-24 20:25 zxcayumi
Summary: https://github.com/MLNLP-World/Paper_Writing_Tips
posted @ 2022-04-22 20:00 zxcayumi
Summary: #!/usr/bin/env Python; # coding=utf-8; from transformers import GPT2LMHeadModel, GPT2Tokenizer; import torch; # Initialize the Tokenizer class for the GPT-2 model.; toke ...
posted @ 2022-04-13 18:37 zxcayumi
Summary: import torch.nn as nn; x = F.log_softmax(x, dim=1); y = F.softmax(y, dim=1); criterion = nn.KLDivLoss(); klloss = criterion(x, y) The input x (the labels you generate yourself) must first pass through a log_softmax layer, ...
posted @ 2022-03-23 12:39 zxcayumi
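The KLDivLoss entry above can be sketched as a complete runnable snippet. This is a minimal illustration, not the post's full code: the tensors x and y are made-up example logits, and reduction="batchmean" is an assumption (it is the mathematically correct averaging for KL divergence; the post's snippet used the default reduction).

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical logits, for illustration only: batch of 3, 5 classes each
x = torch.randn(3, 5)  # model predictions (raw scores)
y = torch.randn(3, 5)  # target scores

# nn.KLDivLoss expects its *input* in log-space and its *target* as probabilities,
# which is why x goes through log_softmax while y goes through softmax.
log_probs = F.log_softmax(x, dim=1)
target_probs = F.softmax(y, dim=1)

criterion = nn.KLDivLoss(reduction="batchmean")
loss = criterion(log_probs, target_probs)
```

Since both arguments are proper (log-)distributions over dim=1, the resulting KL divergence is a non-negative scalar.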
Summary: WWW2022 | Knowledge-prompted pre-training and fine-tuning (qq.com); A new NLP paradigm: from fine-tuning to Prompt, and a look at the AdaPrompt work (qq.com); Papers: thunlp/PromptPapers: Must-read papers on prompt-based tuning ...
posted @ 2022-03-01 15:11 zxcayumi
Summary: Look up deadlines for CCF-ranked conferences: ccf-deadlines (ccfddl.github.io)
posted @ 2022-02-21 14:34 zxcayumi
Summary: A collection of resources on prompt learning. Open-source toolkit: https://github.com/thunlp/OpenPrompt Paper list: https://github.com/thunlp/PromptPapers
posted @ 2022-02-21 09:44 zxcayumi
Summary: #!/usr/bin/python; # coding:utf8; """ Created on 2018-03-13; Updated on 2018-03-13; Author: 片刻; GitHub: https://github.com/apachecn/AiLearnin ...
posted @ 2022-02-20 15:23 zxcayumi
Summary: import torch; import torch.nn as nn; import matplotlib.pyplot as plt; import numpy as np; # Reference: https://blog.csdn.net/jizhidexiaoming/article/d ...
posted @ 2022-02-20 15:21 zxcayumi