How to code with an LLM
Inspired by: https://linux.do/t/topic/126077/7
Local Server: ollama
curl -fsSL https://ollama.com/install.sh | sh
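Once the script finishes, a quick sanity check with the standard Ollama CLI (the model list will be empty on a fresh install):
ollama --version
ollama list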
Enable LAN access by adding OLLAMA_HOST=0.0.0.0 under [Service] in the unit file (the PATH line shown below is specific to this machine; keep whatever value is already there)
sudo vim /etc/systemd/system/ollama.service
[Service]
Environment="PATH=/home/linuxbrew/.linuxbrew/bin:/home/linuxbrew/.linuxbrew/sbin:/home/bgzo/.sdkman/candidates/java/current/bin:/home/bgzo/.nvm/versions/node/v23.3.0/bin:/home/bgzo/demo/bin/:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/home/bgzo/.local/bin"
Environment="OLLAMA_HOST=0.0.0.0"
Restart
sudo systemctl daemon-reload
sudo systemctl restart ollama
systemctl status ollama
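To confirm LAN access works, query the API from another machine on the network (replace <server-ip> with the Ollama host's address; 11434 is Ollama's default port):
curl http://<server-ip>:11434/api/tags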
Model
DeepSeek (developed by the Chinese lab DeepSeek)
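For example, to pull and smoke-test a DeepSeek coding model (the tag below is one of several published in the Ollama library; pick a size that fits your hardware):
ollama pull deepseek-coder:6.7b
ollama run deepseek-coder:6.7b "Write a quicksort in Python"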
Extension
Continue
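To point Continue at the LAN Ollama server, add a model entry to its config. A minimal sketch, assuming the JSON config format (~/.continue/config.json); the title, model tag, and apiBase host are examples to adapt:
{
  "models": [
    {
      "title": "DeepSeek Coder (LAN Ollama)",
      "provider": "ollama",
      "model": "deepseek-coder:6.7b",
      "apiBase": "http://<server-ip>:11434"
    }
  ]
}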
High CPU usage can occur while Continue builds its codebase index across many files; see https://github.com/continuedev/continue/issues/1622, https://github.com/continuedev/continue/issues/866, https://github.com/continuedev/continue/issues/778, and https://utgd.net/article/20938. A possible workaround is sketched below.
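If indexing is the culprit, the issues above point to turning it off. A possible workaround, assuming your Continue version supports a top-level disableIndexing flag in the same config.json:
{
  "disableIndexing": true
}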
References
- https://medium.com/@smfraser/how-to-use-a-local-llm-as-a-free-coding-copilot-in-vs-code-6dffc053369d
Source: https://note.bgzo.cc/how-to/coding-with-llm
