Deploying LLaMA-Factory on an RTX 5060 Ti

Environment setup

Installing the NVIDIA driver and CUDA is omitted here.

(llama-factory) D:\P\llm\LLaMA-Factory>nvcc -V
nvcc: NVIDIA (R) Cuda compiler driver
Copyright (c) 2005-2025 NVIDIA Corporation
Built on Wed_Apr__9_19:29:17_Pacific_Daylight_Time_2025
Cuda compilation tools, release 12.9, V12.9.41
Build cuda_12.9.r12.9/compiler.35813241_0

(llama-factory) D:\P\llm\LLaMA-Factory>nvidia-smi
Thu Oct 23 15:24:02 2025
+-----------------------------------------------------------------------------------------+
| NVIDIA-SMI 576.88                 Driver Version: 576.88         CUDA Version: 12.9     |
|-----------------------------------------+------------------------+----------------------+
| GPU  Name                  Driver-Model | Bus-Id          Disp.A | Volatile Uncorr. ECC |
| Fan  Temp   Perf          Pwr:Usage/Cap |           Memory-Usage | GPU-Util  Compute M. |
|                                         |                        |               MIG M. |
|=========================================+========================+======================|
|   0  NVIDIA GeForce RTX 5060 Ti   WDDM  |   00000000:01:00.0  On |                  N/A |
|  0%   38C    P0             24W /  180W |    2728MiB /  16311MiB |      0%      Default |
|                                         |                        |                  N/A |
+-----------------------------------------+------------------------+----------------------+

If you run into dependency conflicts, try resolving them with pip install --no-deps -e .

# create and activate an isolated environment
conda create -n llama-factory python=3.11.11
conda activate llama-factory
git clone https://github.com/hiyouga/LLaMA-Factory.git
cd LLaMA-Factory
pip install -e ".[metrics]"
# replace the default torch wheels with CUDA 12.8 builds,
# which Blackwell GPUs such as the RTX 5060 Ti require
pip uninstall torch torchvision torchaudio -y
pip install torch torchvision --index-url https://download.pytorch.org/whl/cu128
pip install bitsandbytes

(llama-factory) D:\P\llm\LLaMA-Factory>python
Python 3.11.11 | packaged by Anaconda, Inc. | (main, Dec 11 2024, 16:34:19) [MSC v.1929 64 bit (AMD64)] on win32
Type "help", "copyright", "credits" or "license" for more information.
>>> import torch
>>> print(torch.__version__)    # '2.9.0+cu128'
2.9.0+cu128
>>> print(torch.cuda.is_available())    # True
True
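The interactive check above can also be saved as a standalone script. A minimal sketch (the file name check_env.py is my own; the script degrades gracefully if torch is not importable):

```python
# check_env.py -- sanity check for the llama-factory environment.
# Mirrors the interactive session above: report the torch version
# and whether the GPU is visible to CUDA.
import importlib.util


def torch_cuda_status():
    """Return (torch_version, cuda_available); (None, False) if torch is missing."""
    if importlib.util.find_spec("torch") is None:
        return None, False
    import torch  # imported lazily so the script still runs without torch
    return torch.__version__, torch.cuda.is_available()


if __name__ == "__main__":
    version, available = torch_cuda_status()
    if version is None:
        print("torch is not installed in this environment")
    else:
        # on the setup above this should report torch 2.9.0+cu128 and CUDA available
        print(f"torch {version}, CUDA available: {available}")
```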
Launch the web UI:

llamafactory-cli webui

Then open http://localhost:7860/ in a browser.
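If the page does not load, it can help to first confirm that something is listening on the Gradio port. A minimal stdlib sketch (the host and port defaults below match Gradio's defaults, not anything specific to LLaMA-Factory):

```python
# Probe the Gradio port before opening the browser.
import socket


def webui_ready(host="127.0.0.1", port=7860, timeout=1.0):
    """Return True if a server is accepting connections on host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


if __name__ == "__main__":
    print("web UI reachable:", webui_ready())
```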

This post is from cnblogs, author: magicat. Please credit the original link when reposting: https://www.cnblogs.com/magicat/p/19160805

posted on 2025-10-23 15:25 by magicat