Summary:
from langchain_community.llms.ollama import Ollama from langchain_core.prompts import ChatPromptTemplate, PromptTemplate llm = Ollama(model="qwen:7b") … Read more
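The excerpt above stops right after the model object is created. Below is a minimal runnable sketch of the same LangChain-plus-Ollama setup, assuming a local Ollama server with the qwen:7b model already pulled; the system prompt, the chain wiring, and the sample question are illustrative additions, not taken from the original post.

from langchain_community.llms.ollama import Ollama
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser

# Connect to the locally running Ollama server and select the qwen:7b model.
llm = Ollama(model="qwen:7b")

# A simple chat prompt: a fixed system instruction plus a user-supplied question slot.
prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a helpful assistant."),
    ("human", "{question}"),
])

# Pipe prompt -> model -> plain string (LCEL composition).
chain = prompt | llm | StrOutputParser()

print(chain.invoke({"question": "Briefly introduce the Qwen model family."}))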
posted @ 2024-04-07 09:54 林** Views(91) Comments(0) Recommended(0)
Summary:
from transformers import AutoTokenizer, AutoModel modelPath = "/home/cmcc/server/model/chatglm3-6b" tokenizer = AutoTokenizer.from_pretrained(modelPat… Read more
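The excerpt above is truncated mid-call. Below is a minimal sketch of loading ChatGLM3-6B with transformers, assuming the local path shown in the excerpt and a CUDA GPU; the trust_remote_code flag, half-precision placement, and the sample chat() call follow the model's published usage and are not quoted from the original post.

from transformers import AutoTokenizer, AutoModel

modelPath = "/home/cmcc/server/model/chatglm3-6b"

# ChatGLM3 ships custom modelling code, so trust_remote_code=True is required.
tokenizer = AutoTokenizer.from_pretrained(modelPath, trust_remote_code=True)
model = AutoModel.from_pretrained(modelPath, trust_remote_code=True).half().cuda()
model = model.eval()

# The model's remote code exposes a chat() convenience method.
response, history = model.chat(tokenizer, "Hello, please introduce yourself.", history=[])
print(response)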
posted @ 2024-04-07 09:44 林** Views(24) Comments(0) Recommended(0)
