Pitfalls adapting an embedding model to Ollama

Environment: Red Hat Linux x86_64, Ollama v0.1.44, bge-large-zh-v1.5.gguf
Steps to reproduce:

  1. Download the bge embedding model from ModelScope, then run `ollama create bge-large-zh-v1.5 -f Modelfile` to import it (note: the flag is `-f`, not `--f`)
  2. Run `curl http://localhost:11434/api/embeddings -d '{"model": "bge-large-zh-v1.5:latest", "prompt": "ollama如何使用自定义嵌入模型"}'`
  3. The following error appears:
    msg="embedding generation failed: do embedding request: Post "http://127.0.0.1:51966/embedding": read tcp 127.0.0.1:51969->127.0.0.1:51966: wsarecv: An existing connection was forcibly closed by the remote host."
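
For reference, the Modelfile used in step 1 is typically just a `FROM` line pointing at the local GGUF file (a minimal sketch; the exact filename/path is an assumption, not from the original post):

```
FROM ./bge-large-zh-v1.5.gguf
```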

The problem was finally solved by running `ollama pull quentinz/bge-large-zh-v1.5` and using that model instead of the manually imported one.
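Once the pulled model works, the `/api/embeddings` endpoint returns a JSON object with an `embedding` field. A minimal Python sketch for calling it and comparing two embeddings; the helper names (`build_embedding_request`, `embed`) and the cosine-similarity check are illustrative additions, not part of the original post:

```python
import json
import math
import urllib.request

# Default Ollama endpoint for the legacy embeddings API
OLLAMA_URL = "http://localhost:11434/api/embeddings"

def build_embedding_request(model: str, prompt: str) -> bytes:
    """Build the JSON body expected by Ollama's /api/embeddings endpoint."""
    return json.dumps({"model": model, "prompt": prompt}).encode("utf-8")

def cosine_similarity(a, b) -> float:
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def embed(prompt: str, model: str = "quentinz/bge-large-zh-v1.5") -> list:
    """POST a prompt to the local Ollama server and return the embedding vector."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=build_embedding_request(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["embedding"]

# Example (requires a running Ollama server with the model pulled):
# vec1 = embed("ollama如何使用自定义嵌入模型")
# vec2 = embed("如何在ollama中加载自定义embedding模型")
# print(cosine_similarity(vec1, vec2))
```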

posted @ 2025-03-31 09:42  kayaker