huggingface-transformers: BERT quick start, usage notes

You may have seen this post: https://blog.csdn.net/qiguanjiezl/article/details/122504454

The code there works, but one caveat: don't download an offline copy of the model yourself and load it from disk.

 

Because:

1. The version you download by hand may not match your installed library.

2. There now appears to be an automatic mirror for mainland China, so `model = BertForMaskedLM.from_pretrained('bert-base-uncased')` just works, and the download is fast.

 

You may get a prompt about missing permissions.

If so, exit the shell (or the Windows cmd window) and rerun with sudo, or on Windows open a new cmd window with "Run as administrator".

-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

The official example: https://huggingface.co/docs/transformers/model_doc/bert#transformers.BertForMaskedLM

 

Run it as-is. Note that you need to print the result yourself:

print(tokenizer.decode(predicted_token_id))
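Put together, the masked-LM example from the docs linked above, with the print added, looks like this (a sketch following the official example; the docs show "paris" as the prediction for this prompt):

```python
import torch
from transformers import AutoTokenizer, BertForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = BertForMaskedLM.from_pretrained("bert-base-uncased")

inputs = tokenizer("The capital of France is [MASK].", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Find the position of the [MASK] token in the input ids.
mask_token_index = (inputs.input_ids == tokenizer.mask_token_id)[0].nonzero(as_tuple=True)[0]

# Take the highest-scoring vocabulary id at that position and decode it.
predicted_token_id = logits[0, mask_token_index].argmax(dim=-1)
decoded = tokenizer.decode(predicted_token_id)
print(decoded)
```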

 

It may print the message below. This is only an info message, not an error; just read through it:

Some weights of the model checkpoint at bert-base-uncased were not used when initializing BertForMaskedLM: ['cls.seq_relationship.weight', 'cls.seq_relationship.bias']
- This IS expected if you are initializing BertForMaskedLM from the checkpoint of a model trained on another task or with another architecture (e.g. initializing a BertForSequenceClassification model from a BertForPreTraining model).
- This IS NOT expected if you are initializing BertForMaskedLM from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).

 

Finally, a well-explained tutorial recommended for beginners: https://www.bilibili.com/video/BV1Ao4y1k7AX/

posted @ 2023-04-09 23:14 by 园友1683564