Environment Setup
Installing the NVIDIA driver and the CUDA toolkit is omitted here.
(llama-factory) D:\P\llm\LLaMA-Factory>nvcc -V
nvcc: NVIDIA (R) Cuda compiler driver
Copyright (c) 2005-2025 NVIDIA Corporation
Built on Wed_Apr__9_19:29:17_Pacific_Daylight_Time_2025
Cuda compilation tools, release 12.9, V12.9.41
Build cuda_12.9.r12.9/compiler.35813241_0
(llama-factory) D:\P\llm\LLaMA-Factory>nvidia-smi
Thu Oct 23 15:24:02 2025
+-----------------------------------------------------------------------------------------+
| NVIDIA-SMI 576.88                 Driver Version: 576.88         CUDA Version: 12.9     |
|-----------------------------------------+------------------------+----------------------+
| GPU  Name                  Driver-Model | Bus-Id          Disp.A | Volatile Uncorr. ECC |
| Fan  Temp   Perf          Pwr:Usage/Cap |           Memory-Usage | GPU-Util  Compute M. |
|                                         |                        |               MIG M. |
|=========================================+========================+======================|
|   0  NVIDIA GeForce RTX 5060 Ti   WDDM  |   00000000:01:00.0  On |                  N/A |
|  0%   38C    P0             24W /  180W |    2728MiB /  16311MiB |      0%      Default |
|                                         |                        |                  N/A |
+-----------------------------------------+------------------------+----------------------+
If you run into dependency conflicts, try resolving them with pip install --no-deps -e .
conda create -n llama-factory python=3.11.11
conda activate llama-factory
git clone https://github.com/hiyouga/LLaMA-Factory.git
cd LLaMA-Factory
pip install -e ".[metrics]"
pip uninstall torch torchvision torchaudio -y
pip3 install torch torchvision --index-url https://download.pytorch.org/whl/cu128
pip install bitsandbytes
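Before moving on, it is worth a quick check that the bitsandbytes wheel actually imports on Windows. This one-liner is my addition rather than part of the original steps; it just relies on the package exposing a __version__ attribute, so a clean import and version print is enough:

python -c "import bitsandbytes as bnb; print(bnb.__version__)"

If this fails with a CUDA-related error, reinstalling bitsandbytes after the cu128 build of torch is in place is a reasonable first thing to try.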
(llama-factory) D:\P\llm\LLaMA-Factory>python
Python 3.11.11 | packaged by Anaconda, Inc. | (main, Dec 11 2024, 16:34:19) [MSC v.1929 64 bit (AMD64)] on win32
Type "help", "copyright", "credits" or "license" for more information.
>>> import torch
>>> print(torch.__version__) # '2.9.0+cu128'
2.9.0+cu128
>>> print(torch.cuda.is_available()) # True
True
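Two more optional checks, not in the original session, can confirm that PyTorch actually sees the RTX 5060 Ti and that bf16 is usable; torch.cuda.get_device_name and torch.cuda.is_bf16_supported are standard PyTorch calls:

>>> print(torch.cuda.get_device_name(0))   # expect the RTX 5060 Ti to be reported
>>> print(torch.cuda.is_bf16_supported())  # True means bf16 mixed precision is an option for training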
llamafactory-cli webui
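This launches the Gradio web UI, which listens on port 7860 by default. If that port is taken or you want a public share link, Gradio honors environment variables such as GRADIO_SERVER_PORT, and the LLaMA-Factory docs mention GRADIO_SHARE; treat the exact names below as assumptions to verify against the version you installed:

set GRADIO_SERVER_PORT=7861
set GRADIO_SHARE=1
llamafactory-cli webui

Running llamafactory-cli version first is also a quick way to confirm the CLI itself installed correctly.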