Solution for AttributeError: 'Qwen2ForCausalLM' object has no attribute 'chat'

The fix is to patch a chat() method onto the class yourself, following the interface that other LLMs expose:

import torch
from transformers import Qwen2ForCausalLM


def chat(model, tokenizer, query, history=None, **gen_kwargs):
    """Build the chat prompt, generate a completion, and return the decoded reply."""
    history = history or []
    # Render the conversation with the model's chat template and tokenize it.
    input_ids = tokenizer.apply_chat_template(
        history + [{'role': 'user', 'content': query}],
        add_generation_prompt=True,
    )
    # Generate using the model's own generation config, overridden by any caller kwargs.
    output_ids = model.generate(
        inputs=torch.tensor([input_ids]).to(model.device),
        **(model.generation_config.to_dict() | gen_kwargs),
    )
    # Keep only the newly generated tokens and drop a trailing EOS token if present.
    output_ids = output_ids[0][len(input_ids):].tolist()
    if output_ids[-1] == tokenizer.eos_token_id:
        output_ids = output_ids[:-1]
    return tokenizer.decode(output_ids)

# Monkey-patch the method onto the class so model.chat(...) works as it does for other LLMs.
Qwen2ForCausalLM.chat = chat
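
A minimal usage sketch, assuming a Qwen2-architecture checkpoint such as Qwen/Qwen1.5-7B-Chat (the model name and max_new_tokens value below are illustrative, not from the original post):

from transformers import AutoTokenizer, Qwen2ForCausalLM

# Hypothetical checkpoint; replace with the model you actually use.
model_name = "Qwen/Qwen1.5-7B-Chat"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = Qwen2ForCausalLM.from_pretrained(model_name)

# The instance is bound as the first argument, so the call looks like other LLMs' chat().
reply = model.chat(tokenizer, "Hello, who are you?", max_new_tokens=128)
print(reply)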