
Creating a Python virtual environment

[bluesanta@localhost local]$ sudo ln -s /home/bluesanta/Applications/stable-diffusion /usr/local/stable-diffusion
[bluesanta@localhost ~]$ cd /usr/local/stable-diffusion/
[bluesanta@localhost stable-diffusion]$ python3.11 -m venv .venv

Activating the Python virtual environment

[bluesanta@localhost stable-diffusion]$ source .venv/bin/activate
(.venv) [bluesanta@localhost stable-diffusion]$

Upgrading pip

(.venv) [bluesanta@localhost stable-diffusion]$ python -m pip install --upgrade pip

Downloading ComfyUI

(.venv) [bluesanta@localhost stable-diffusion]$ git clone https://github.com/comfyanonymous/ComfyUI.git

Installing ComfyUI dependencies

(.venv) [bluesanta@localhost stable-diffusion]$ cd ComfyUI/
(.venv) [bluesanta@localhost ComfyUI]$ pip install -r requirements.txt
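Before launching, it is worth a quick sanity check that the torch build pulled in by requirements.txt can actually see the GPU (not part of the original steps; it should print the installed version and True):

(.venv) [bluesanta@localhost ComfyUI]$ python -c "import torch; print(torch.__version__, torch.cuda.is_available())"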

Running ComfyUI

(.venv) [bluesanta@localhost ComfyUI]$ python main.py
Checkpoint files will always be loaded safely.
Total VRAM 24090 MB, total RAM 128222 MB
pytorch version: 2.7.0+cu126
Set vram state to: NORMAL_VRAM
Device: cuda:0 NVIDIA GeForce RTX 4090 : cudaMallocAsync
Using pytorch attention
Python version: 3.11.12 (main, Apr 22 2025, 23:29:55) [GCC 11.5.0 20240719 (Red Hat 11.5.0-5)]
ComfyUI version: 0.3.30
****** User settings have been changed to be stored on the server instead of browser storage. ******
****** For multi-user setups add the --multi-user CLI argument to enable multiple user profiles. ******
ComfyUI frontend version: 1.17.11
[Prompt Server] web root: /home/bluesanta/Applications/stable-diffusion/.venv/lib/python3.11/site-packages/comfyui_frontend_package/static

Import times for custom nodes:
   0.0 seconds: /home/bluesanta/Applications/stable-diffusion/ComfyUI/custom_nodes/websocket_image_save.py

Starting server

To see the GUI go to: http://127.0.0.1:8188
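With the server up, it can also be probed from another shell; ComfyUI exposes a /system_stats endpoint that returns device and memory information as JSON (a quick check, assuming the default port):

[bluesanta@localhost ~]$ curl http://127.0.0.1:8188/system_stats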

Installing ComfyUI Manager

(.venv) [bluesanta@localhost ComfyUI]$ cd custom_nodes
(.venv) [bluesanta@localhost custom_nodes]$ git clone https://github.com/ltdrdata/ComfyUI-Manager.git

Sharing models with stable-diffusion-webui

(.venv) [bluesanta@localhost custom_nodes]$ cd /usr/local/stable-diffusion/ComfyUI/
(.venv) [bluesanta@localhost ComfyUI]$ cp extra_model_paths.yaml.example extra_model_paths.yaml
(.venv) [bluesanta@localhost ComfyUI]$ vi extra_model_paths.yaml
#    base_path: path/to/stable-diffusion-webui/
    base_path: /usr/local/stable-diffusion/stable-diffusion-webui/
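For reference, the edited section of extra_model_paths.yaml should end up looking roughly like this (a sketch based on the bundled example file; the relative sub-paths below are the defaults from that example and may differ in your copy):

a111:
    base_path: /usr/local/stable-diffusion/stable-diffusion-webui/
    checkpoints: models/Stable-diffusion
    vae: models/VAE
    loras: models/Lora
    embeddings: embeddings
    controlnet: models/ControlNet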

Allowing external access to ComfyUI

(.venv) [bluesanta@localhost ComfyUI]$ python main.py --listen 0.0.0.0

Opening port 8188

[bluesanta@localhost ~]$ sudo firewall-cmd --permanent --zone=public --add-port=8188/tcp
success
[bluesanta@localhost ~]$ sudo firewall-cmd --reload
success
[bluesanta@localhost ~]$ sudo firewall-cmd --list-ports
3389/tcp 7860/tcp 8188/tcp

Installing stable-diffusion-webui

[bluesanta@localhost ~]$ cd Applications
[bluesanta@localhost Applications]$ mkdir stable-diffusion
[bluesanta@localhost stable-diffusion]$ git clone https://github.com/AUTOMATIC1111/stable-diffusion-webui.git

Creating a Python virtual environment

[bluesanta@localhost stable-diffusion]$ python3.11 -m venv .venv

Activating the Python virtual environment

[bluesanta@localhost stable-diffusion]$ source .venv/bin/activate
(.venv) [bluesanta@localhost stable-diffusion]$

Running stable-diffusion-webui

(.venv) [bluesanta@localhost stable-diffusion]$ cd stable-diffusion-webui/
(.venv) [bluesanta@localhost stable-diffusion-webui]$ ./webui.sh
 
Loading weights [6ce0161689] from /home/bluesanta/Applications/stable-diffusion/stable-diffusion-webui/models/Stable-diffusion/v1-5-pruned-emaonly.safetensors
Running on local URL:  http://127.0.0.1:7860

Allowing external access to stable-diffusion-webui

(.venv) [bluesanta@localhost stable-diffusion-webui]$ vi webui-user.sh
export COMMANDLINE_ARGS="--xformers --xformers-flash-attention --share --listen --gradio-auth bluexmas:passwd"
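For reference, what these flags do: --xformers enables the xformers memory-efficient attention backend, --xformers-flash-attention adds Flash Attention on top of it, --share opens a public gradio.live tunnel, --listen binds the server to 0.0.0.0 for LAN access, and --gradio-auth user:pass puts basic authentication in front of the UI (replace bluexmas:passwd with your own credentials). If the public tunnel is not needed, a more conservative line would be:

export COMMANDLINE_ARGS="--xformers --listen --gradio-auth bluexmas:passwd"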

Opening port 7860

(.venv) [bluesanta@localhost stable-diffusion-webui]$ sudo firewall-cmd --add-port=7860/tcp --permanent
success
(.venv) [bluesanta@localhost stable-diffusion-webui]$ sudo firewall-cmd --reload
success
(.venv) [bluesanta@localhost stable-diffusion-webui]$ sudo firewall-cmd --list-all
public (active)
  target: default
  icmp-block-inversion: no
  interfaces: enp5s0
  sources: 
  services: cockpit dhcpv6-client ssh
  ports: 3389/tcp 7860/tcp
  protocols: 
  forward: yes
  masquerade: no
  forward-ports: 
  source-ports: 
  icmp-blocks: 
  rich rules: 

Running stable-diffusion-webui

(.venv) [bluesanta@localhost stable-diffusion-webui]$ ./webui.sh
   
Loading weights [6ce0161689] from /home/bluesanta/Applications/stable-diffusion/stable-diffusion-webui/models/Stable-diffusion/v1-5-pruned-emaonly.safetensors
Running on local URL:  http://0.0.0.0:7860

WSL machine status

PS C:\Users\bluesanta> wsl -l -v
  NAME              STATE           VERSION
* docker-desktop    Stopped         2
  Ubuntu            Running         2

Shutting down a WSL machine

PS C:\Users\bluesanta> wsl -t Ubuntu
The operation completed successfully.

Reinstalling a WSL machine

Removing the existing machine

PS C:\Users\bluesanta> wsl --unregister Ubuntu
Unregistering.
The operation completed successfully.
PS C:\Users\bluesanta> wsl -l -v
  NAME              STATE           VERSION
* docker-desktop    Stopped         2

Listing available WSL distributions

PS C:\Users\bluesanta> wsl -l -o
The following is a list of valid distributions that can be installed.
Install using 'wsl.exe --install <Distro>'.

NAME                            FRIENDLY NAME
Ubuntu                          Ubuntu
Debian                          Debian GNU/Linux
kali-linux                      Kali Linux Rolling
Ubuntu-18.04                    Ubuntu 18.04 LTS
Ubuntu-20.04                    Ubuntu 20.04 LTS
Ubuntu-22.04                    Ubuntu 22.04 LTS
Ubuntu-24.04                    Ubuntu 24.04 LTS
OracleLinux_7_9                 Oracle Linux 7.9
OracleLinux_8_7                 Oracle Linux 8.7
OracleLinux_9_1                 Oracle Linux 9.1
openSUSE-Leap-15.6              openSUSE Leap 15.6
SUSE-Linux-Enterprise-15-SP5    SUSE Linux Enterprise 15 SP5
SUSE-Linux-Enterprise-15-SP6    SUSE Linux Enterprise 15 SP6
openSUSE-Tumbleweed             openSUSE Tumbleweed

Installing a WSL Linux distribution

PS C:\Users\bluesanta> wsl --install Ubuntu-22.04
Installing: Ubuntu 22.04 LTS
Ubuntu 22.04 LTS has been installed.
Launching Ubuntu 22.04 LTS...
Installing, this may take a few minutes...
Please create a default UNIX user account. The username does not need to match your Windows username.
For more information visit: https://aka.ms/wslusers
Enter new UNIX username: bluesanta
New password:
Retype new password:
passwd: password updated successfully
Installation successful!
To run a command as administrator (user "root"), use "sudo <command>".
See "man sudo_root" for details.
 
Welcome to Ubuntu 22.04.5 LTS (GNU/Linux 5.15.167.4-microsoft-standard-WSL2 x86_64)
 
 * Documentation:  https://help.ubuntu.com
 * Management:     https://landscape.canonical.com
 * Support:        https://ubuntu.com/pro
 
 System information as of Sun Feb 16 15:54:18 KST 2025
 
  System load:  0.0                 Processes:             81
  Usage of /:   0.1% of 1006.85GB   Users logged in:       0
  Memory usage: 6%                  IPv4 address for eth0: 172.27.90.206
  Swap usage:   0%
 
 
This message is shown once a day. To disable it please create the
/home/bluesanta/.hushlogin file.
bluesanta@DESKTOP-2NKL9OB:~$

Verifying the installation

PS C:\Users\bluesanta> wsl -l -v
  NAME              STATE           VERSION
* docker-desktop    Running         2
  Ubuntu-22.04      Running         2

Installing Python 3.11

Checking the currently installed Python version

bluesanta@DESKTOP-2NKL9OB:~$ python3 --version
Python 3.10.12

Removing the existing Python

bluesanta@DESKTOP-2NKL9OB:~$ sudo apt remove python3
bluesanta@DESKTOP-2NKL9OB:~$ sudo apt autoremove

Installing packages required to build Python 3.11

bluesanta@DESKTOP-2NKL9OB:~$ sudo apt update
bluesanta@DESKTOP-2NKL9OB:~$ sudo apt install build-essential zlib1g-dev libncurses5-dev libgdbm-dev libnss3-dev libssl-dev libreadline-dev libffi-dev libsqlite3-dev wget libbz2-dev pkg-config lzma liblzma-dev cmake build-essential libx11-dev libxext-dev libxft-dev libxss-dev libxrender-dev libfontconfig1-dev libxinerama-dev libxrandr-dev libxcursor-dev libminizip-dev libbz2-dev liblzma-dev libzstd-dev libssl-dev zip unzip

Installing Tcl 8.6

bluesanta@DESKTOP-PAV8U75:/mnt/c/Llama$ tar xvf tcl8.6.16-src.tar.gz
bluesanta@DESKTOP-PAV8U75:/mnt/c/Llama$ cd tcl8.6.16/unix/
bluesanta@DESKTOP-PAV8U75:/mnt/c/Llama/tcl8.6.16/unix$ ./configure --prefix=/usr/local
bluesanta@DESKTOP-PAV8U75:/mnt/c/Llama/tcl8.6.16/unix$ make -j$(nproc)
bluesanta@DESKTOP-PAV8U75:/mnt/c/Llama/tcl8.6.16/unix$ sudo make install

Installing Tk 8.6

bluesanta@DESKTOP-PAV8U75:/mnt/c/Llama$ tar xvf tk8.6.16-src.tar.gz
bluesanta@DESKTOP-PAV8U75:/mnt/c/Llama$ cd tk8.6.16/unix/
bluesanta@DESKTOP-PAV8U75:/mnt/c/Llama/tk8.6.16/unix$ ./configure --prefix=/usr/local --with-tcl=/usr/local/lib
bluesanta@DESKTOP-PAV8U75:/mnt/c/Llama/tk8.6.16/unix$ make -j$(nproc)
bluesanta@DESKTOP-PAV8U75:/mnt/c/Llama/tk8.6.16/unix$ sudo make install
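To confirm that the freshly built Tcl is the one that gets picked up, query its patch level (a quick check; tclsh8.6 is installed under /usr/local/bin by the steps above and should report 8.6.16):

bluesanta@DESKTOP-PAV8U75:~$ echo 'puts [info patchlevel]' | /usr/local/bin/tclsh8.6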

Downloading the Python 3.11 source

bluesanta@DESKTOP-2NKL9OB:~$ wget https://www.python.org/ftp/python/3.11.11/Python-3.11.11.tgz

Extracting the Python 3.11 source

bluesanta@DESKTOP-2NKL9OB:~$ tar -xf Python-3.11.11.tgz

Running configure

bluesanta@DESKTOP-2NKL9OB:~$ cd Python-3.11.11
bluesanta@DESKTOP-2NKL9OB:~/Python-3.11.11$ export TCLTK_CFLAGS="-I/usr/local/include"
bluesanta@DESKTOP-2NKL9OB:~/Python-3.11.11$ export TCLTK_LIBS="-L/usr/local/lib -ltcl8.6 -ltk8.6"
bluesanta@DESKTOP-2NKL9OB:~/Python-3.11.11$ ./configure --enable-optimizations

Building Python 3.11

bluesanta@DESKTOP-2NKL9OB:~/Python-3.11.11$ grep -c processor /proc/cpuinfo
28
bluesanta@DESKTOP-2NKL9OB:~/Python-3.11.11$ make -j$(nproc)
bluesanta@DESKTOP-2NKL9OB:~/Python-3.11.11$ sudo make install

Verifying tkinter

bluesanta@DESKTOP-PAV8U75:~/Python-3.11.11$ python3 -m tkinter

Verifying the Python 3.11 installation

bluesanta@DESKTOP-2NKL9OB:/mnt/c/Users/bluesanta$ python3 --version
Python 3.11.11

Installing pip

bluesanta@DESKTOP-2NKL9OB:/mnt/c/Users/bluesanta$ wget https://bootstrap.pypa.io/get-pip.py
bluesanta@DESKTOP-2NKL9OB:/mnt/c/Users/bluesanta$ python3 get-pip.py
Defaulting to user installation because normal site-packages is not writeable
Collecting pip
  Downloading pip-25.0.1-py3-none-any.whl.metadata (3.7 kB)
Collecting wheel
  Downloading wheel-0.45.1-py3-none-any.whl.metadata (2.3 kB)
Downloading pip-25.0.1-py3-none-any.whl (1.8 MB)
   ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.8/1.8 MB 708.7 kB/s eta 0:00:00
Downloading wheel-0.45.1-py3-none-any.whl (72 kB)
Installing collected packages: wheel, pip
  WARNING: The script wheel is installed in '/home/bluesanta/.local/bin' which is not on PATH.
  Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location.
  WARNING: The scripts pip, pip3 and pip3.11 are installed in '/home/bluesanta/.local/bin' which is not on PATH.
  Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location.
Successfully installed pip-25.0.1 wheel-0.45.1
 
[notice] A new release of pip is available: 24.0 -> 25.0.1
[notice] To update, run: pip3 install --upgrade pip
bluesanta@DESKTOP-2NKL9OB:/mnt/c/Users/bluesanta$ pip3 --version
pip 25.0.1 from /home/bluesanta/.local/lib/python3.11/site-packages/pip (python 3.11)
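The two PATH warnings above are worth acting on: get-pip.py installed its scripts into ~/.local/bin. If pip3 is not found in a new shell, add that directory to PATH once in ~/.bashrc:

bluesanta@DESKTOP-2NKL9OB:~$ echo 'export PATH="$HOME/.local/bin:$PATH"' >> ~/.bashrc
bluesanta@DESKTOP-2NKL9OB:~$ source ~/.bashrc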

Installing the CUDA Toolkit & cuDNN

Installing CUDA Toolkit 12.1

bluesanta@DESKTOP-2NKL9OB:~$ wget https://developer.download.nvidia.com/compute/cuda/12.1.0/local_installers/cuda_12.1.0_530.30.02_linux.run
bluesanta@DESKTOP-2NKL9OB:~$ chmod +x cuda_12.1.0_530.30.02_linux.run
bluesanta@DESKTOP-2NKL9OB:~$ sudo sh cuda_12.1.0_530.30.02_linux.run --toolkit --silent --driver
bluesanta@DESKTOP-2NKL9OB:~$ ls /usr/local/ | grep cuda
cuda
cuda-12.1

Installing CUDA Toolkit 11.8

bluesanta@DESKTOP-PAV8U75:/mnt/c/Llama$ wget https://developer.download.nvidia.com/compute/cuda/repos/wsl-ubuntu/x86_64/cuda-wsl-ubuntu.pin
bluesanta@DESKTOP-PAV8U75:/mnt/c/Llama$ sudo mv cuda-wsl-ubuntu.pin /etc/apt/preferences.d/cuda-repository-pin-600
bluesanta@DESKTOP-PAV8U75:/mnt/c/Llama$ wget https://developer.download.nvidia.com/compute/cuda/11.8.0/local_installers/cuda-repo-wsl-ubuntu-11-8-local_11.8.0-1_amd64.deb
bluesanta@DESKTOP-PAV8U75:/mnt/c/Llama$ sudo dpkg -i cuda-repo-wsl-ubuntu-11-8-local_11.8.0-1_amd64.deb
bluesanta@DESKTOP-PAV8U75:/mnt/c/Llama$ sudo cp /var/cuda-repo-wsl-ubuntu-11-8-local/cuda-*-keyring.gpg /usr/share/keyrings/
bluesanta@DESKTOP-PAV8U75:/mnt/c/Llama$ sudo apt update
bluesanta@DESKTOP-PAV8U75:/mnt/c/Llama$ sudo apt -y install cuda
bluesanta@DESKTOP-PAV8U75:/mnt/c/Llama$ ls /usr/local/ | grep cuda
cuda
cuda-11
cuda-11.8

Installing cuDNN 9.8

bluesanta@DESKTOP-2NKL9OB:~$ wget https://developer.download.nvidia.com/compute/cudnn/9.8.0/local_installers/cudnn-local-repo-ubuntu2204-9.8.0_1.0-1_amd64.deb
bluesanta@DESKTOP-2NKL9OB:~$ sudo dpkg -i cudnn-local-repo-ubuntu2204-9.8.0_1.0-1_amd64.deb
bluesanta@DESKTOP-2NKL9OB:~$ sudo cp /var/cudnn-local-repo-ubuntu2204-9.8.0/cudnn-*-keyring.gpg /usr/share/keyrings/
bluesanta@DESKTOP-2NKL9OB:~$ sudo apt update
bluesanta@DESKTOP-2NKL9OB:~$ sudo apt-get -y install cudnn

Installing cuDNN 8.6

bluesanta@DESKTOP-PAV8U75:/mnt/c/Llama$ tar -xvf cudnn-linux-x86_64-8.6.0.163_cuda11-archive.tar.xz
bluesanta@DESKTOP-PAV8U75:/mnt/c/Llama$ sudo cp cudnn-linux-x86_64-8.6.0.163_cuda11-archive/include/cudnn*.h /usr/local/cuda-11.8/include
bluesanta@DESKTOP-PAV8U75:/mnt/c/Llama$ sudo cp -P cudnn-linux-x86_64-8.6.0.163_cuda11-archive/lib/libcudnn* /usr/local/cuda-11.8/lib64
bluesanta@DESKTOP-PAV8U75:/mnt/c/Llama$ sudo chmod a+r /usr/local/cuda-11.8/lib64/libcudnn*
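To confirm which cuDNN version the toolkit now sees, inspect the version macros in the copied header (a quick check; it should report major 8, minor 6):

bluesanta@DESKTOP-PAV8U75:/mnt/c/Llama$ grep -A2 'define CUDNN_MAJOR' /usr/local/cuda-11.8/include/cudnn_version.h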

Adding to ~/.bashrc (for CUDA 12.1)

bluesanta@DESKTOP-2NKL9OB:~$ vi ~/.bashrc
export CUDA_PATH=/usr/local/cuda-12.1
export CUDA_HOME=/usr/local/cuda-12.1
export CUDADIR=/usr/local/cuda-12.1
export PATH=$PATH:$CUDADIR/bin
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$CUDADIR/lib64

# export CUDNN_PATH=$(dirname $(python -c "import nvidia.cudnn;print(nvidia.cudnn.__file__)"))
# export CUDNN_PATH=/mnt/c/Llama/.venv_wsl/lib/python3.11/site-packages/nvidia/cudnn
# export LD_LIBRARY_PATH=$CUDNN_PATH/lib:$CONDA_PREFIX/lib/:$LD_LIBRARY_PATH

# export HF_HOME=/mnt/c/Llama/huggingface

Adding to ~/.bashrc (for CUDA 11.8)

bluesanta@DESKTOP-2NKL9OB:~$ vi ~/.bashrc
export CUDA_PATH=/usr/local/cuda-11
export CUDA_HOME=/usr/local/cuda-11
export CUDADIR=/usr/local/cuda-11
export PATH=$PATH:$CUDADIR/bin
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$CUDADIR/lib64

# export CUDNN_PATH=$(dirname $(python -c "import nvidia.cudnn;print(nvidia.cudnn.__file__)"))
# export CUDNN_PATH=/mnt/c/Llama/.venv_wsl/lib/python3.11/site-packages/nvidia/cudnn
# export LD_LIBRARY_PATH=/usr/local/lib:$CUDNN_PATH/lib:$CONDA_PREFIX/lib/:$LD_LIBRARY_PATH

export HF_HOME=/mnt/c/Llama/huggingface

Verifying the CUDA Toolkit installation

bluesanta@DESKTOP-2NKL9OB:/mnt/c/Users/bluesanta$ nvcc --version
nvcc: NVIDIA (R) Cuda compiler driver
Copyright (c) 2005-2023 NVIDIA Corporation
Built on Tue_Feb__7_19:32:13_PST_2023
Cuda compilation tools, release 12.1, V12.1.66
Build cuda_12.1.r12.1/compiler.32415258_0

Installing TensorFlow with GPU support

$ pip install tensorrt==8.6.1
$ pip install tensorflow[and-cuda]==2.15
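Afterwards, TensorFlow should report the GPU (a quick check; an empty list means the CUDA/cuDNN setup above is not being picked up):

$ python3 -c "import tensorflow as tf; print(tf.config.list_physical_devices('GPU'))"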

Installing faiss-gpu-cu12

bluesanta@DESKTOP-2NKL9OB:~$ pip install faiss-gpu-cu12
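A one-liner confirms that the wheel can see the GPU (it should print the number of visible devices, at least 1):

bluesanta@DESKTOP-2NKL9OB:~$ python3 -c "import faiss; print(faiss.get_num_gpus())"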

Installing Ollama

Downloading and running the Ollama install script

bluesanta@DESKTOP-2NKL9OB:~$ curl https://ollama.ai/install.sh | sh
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100 13269    0 13269    0     0  34377      0 --:--:-- --:--:-- --:--:-- 34375
>>> Installing ollama to /usr/local
>>> Downloading Linux amd64 bundle
######################################################################## 100.0%
>>> Creating ollama user...
>>> Adding ollama user to render group...
>>> Adding ollama user to video group...
>>> Adding current user to ollama group...
>>> Creating ollama systemd service...
>>> Enabling and starting ollama service...
Created symlink /etc/systemd/system/default.target.wants/ollama.service → /etc/systemd/system/ollama.service.
>>> Nvidia GPU detected.
>>> The Ollama API is now available at 127.0.0.1:11434.
>>> Install complete. Run "ollama" from the command line.

Verifying the Ollama service

bluesanta@DESKTOP-2NKL9OB:~$ sudo systemctl status ollama.service
● ollama.service - Ollama Service
     Loaded: loaded (/etc/systemd/system/ollama.service; enabled; vendor preset: enabled)
     Active: active (running) since Sun 2025-02-16 20:56:53 KST; 17s ago
   Main PID: 1766 (ollama)
      Tasks: 11 (limit: 9345)
     Memory: 38.8M
     CGroup: /system.slice/ollama.service
             └─1766 /usr/local/bin/ollama serve

Feb 16 20:56:53 DESKTOP-2NKL9OB systemd[1]: Started Ollama Service.
Feb 16 20:56:53 DESKTOP-2NKL9OB ollama[1766]: Couldn't find '/usr/share/ollama/.ollama/id_ed25519'. Generating new priv>

Changing the Ollama model path

Adding to ollama.service

bluesanta@DESKTOP-2NKL9OB:~$ sudo vi /etc/systemd/system/ollama.service
Environment="OLLAMA_MODELS=/mnt/c/Llama/Models/Ollama/models"

Restarting the Ollama service

bluesanta@DESKTOP-2NKL9OB:~$ sudo systemctl daemon-reload
bluesanta@DESKTOP-2NKL9OB:~$ sudo systemctl restart ollama.service

Creating a Python virtual environment

bluesanta@DESKTOP-2NKL9OB:/mnt/c/Llama$ python3 -m venv .venv_wsl
bluesanta@DESKTOP-2NKL9OB:/mnt/c/Llama$ source .venv_wsl/bin/activate
(.venv_wsl) bluesanta@DESKTOP-2NKL9OB:/mnt/c/Llama$

Installing langserve_ollama dependencies

bluesanta@DESKTOP-2NKL9OB:/mnt/c/Llama$ pip install fastapi uvicorn langchain_core langserve langchain_ollama langchain_text_splitters langchain_community sse_starlette pdfplumber faiss-gpu-cu12

Installing the huggingface-hub package

bluesanta@DESKTOP-2NKL9OB:/mnt/c/Llama$ pip install huggingface-hub

Installing an Ollama model

bluesanta@DESKTOP-2NKL9OB:/mnt/c/Llama$ huggingface-cli download LGAI-EXAONE/EXAONE-3.5-7.8B-Instruct-GGUF EXAONE-3.5-7.8B-Instruct-BF16.gguf --local-dir /mnt/c/Llama/Models/Ollama/gguf --local-dir-use-symlinks False
(.venv_wsl) bluesanta@DESKTOP-2NKL9OB:/mnt/c/Llama$ cd Models/Ollama/gguf/
(.venv_wsl) bluesanta@DESKTOP-2NKL9OB:/mnt/c/Llama/Models/Ollama/gguf$ ollama create exaone -f EXAONE-3.5-7.8B-Instruct-BF16/Modelfile
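ollama create reads a Modelfile that points at the downloaded GGUF. A minimal one looks roughly like this (a sketch; the relative FROM path is an assumption, and a TEMPLATE/PARAMETER block matching the EXAONE chat format would normally be added as well):

# Modelfile (hypothetical minimal example)
FROM ../EXAONE-3.5-7.8B-Instruct-BF16.gguf

Once created, the model can be smoke-tested with ollama run exaone.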

Downloading ComfyUI

C:\stable-diffusion>git clone https://github.com/comfyanonymous/ComfyUI.git

Creating a Python virtual environment

Creating the virtual environment

C:\stable-diffusion>cd ComfyUI
C:\stable-diffusion\ComfyUI>python -m venv venvIdmVton

Activating the virtual environment

C:\stable-diffusion\ComfyUI>venvIdmVton\Scripts\activate
(venvIdmVton) C:\stable-diffusion\ComfyUI>

Upgrading pip

(venvIdmVton) C:\stable-diffusion\ComfyUI>python.exe -m pip install --upgrade pip

Installing PyTorch

(venvIdmVton) C:\stable-diffusion\ComfyUI>pip install torch torchvision torchaudio --extra-index-url https://download.pytorch.org/whl/cu121

Installing ComfyUI dependencies

(venvIdmVton) C:\stable-diffusion\ComfyUI>pip install -r requirements.txt

Running ComfyUI

(venvIdmVton) C:\stable-diffusion\ComfyUI>python main.py
Total VRAM 8188 MB, total RAM 16092 MB
pytorch version: 2.4.0+cu121
Set vram state to: NORMAL_VRAM
Device: cuda:0 NVIDIA GeForce RTX 4060 Laptop GPU : cudaMallocAsync
Using pytorch cross attention
[Prompt Server] web root: C:\stable-diffusion\ComfyUI\web
C:\stable-diffusion\ComfyUI\venvIdmVton\Lib\site-packages\kornia\feature\lightglue.py:44: FutureWarning: `torch.cuda.amp.custom_fwd(args...)` is deprecated. Please use `torch.amp.custom_fwd(args..., device_type='cuda')` instead.
  @torch.cuda.amp.custom_fwd(cast_inputs=torch.float32)

Import times for custom nodes:
   0.0 seconds: C:\stable-diffusion\ComfyUI\custom_nodes\websocket_image_save.py

Starting server

To see the GUI go to: http://127.0.0.1:8188

Installing ComfyUI Manager

(venvIdmVton) C:\stable-diffusion\ComfyUI>cd custom_nodes
(venvIdmVton) C:\stable-diffusion\ComfyUI\custom_nodes>git clone https://github.com/ltdrdata/ComfyUI-Manager.git

Restarting ComfyUI

(venvIdmVton) C:\stable-diffusion\ComfyUI>python main.py
[START] Security scan
[DONE] Security scan
## ComfyUI-Manager: installing dependencies. (GitPython)
## ComfyUI-Manager: installing dependencies done.
** ComfyUI startup time: 2024-07-27 23:54:20.182361
** Platform: Windows
** Python version: 3.11.0 (main, Oct 24 2022, 18:26:48) [MSC v.1933 64 bit (AMD64)]
** Python executable: C:\stable-diffusion\ComfyUI\venvIdmVton\Scripts\python.exe
** ComfyUI Path: C:\stable-diffusion\ComfyUI
** Log path: C:\stable-diffusion\ComfyUI\comfyui.log

Prestartup times for custom nodes:
  18.1 seconds: C:\stable-diffusion\ComfyUI\custom_nodes\ComfyUI-Manager

Total VRAM 8188 MB, total RAM 16092 MB
pytorch version: 2.4.0+cu121
Set vram state to: NORMAL_VRAM
Device: cuda:0 NVIDIA GeForce RTX 4060 Laptop GPU : cudaMallocAsync
Using pytorch cross attention
[Prompt Server] web root: C:\stable-diffusion\ComfyUI\web
C:\stable-diffusion\ComfyUI\venvIdmVton\Lib\site-packages\kornia\feature\lightglue.py:44: FutureWarning: `torch.cuda.amp.custom_fwd(args...)` is deprecated. Please use `torch.amp.custom_fwd(args..., device_type='cuda')` instead.
  @torch.cuda.amp.custom_fwd(cast_inputs=torch.float32)
### Loading: ComfyUI-Manager (V2.48.2)
### ComfyUI Revision: 2413 [f82d09c9] | Released on '2024-07-27'

Import times for custom nodes:
   0.0 seconds: C:\stable-diffusion\ComfyUI\custom_nodes\websocket_image_save.py
   0.3 seconds: C:\stable-diffusion\ComfyUI\custom_nodes\ComfyUI-Manager

Starting server

To see the GUI go to: http://127.0.0.1:8188
[ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/alter-list.json
[ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/model-list.json
[ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/github-stats.json
[ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/extension-node-map.json
[ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/custom-node-list.json

Installing CUDA 12.1

Download page: https://developer.nvidia.com/cuda-12-1-0-download-archive

Monitoring GPU usage (the nvidia-smi command)

C:\stable-diffusion>nvidia-smi
Sat Jul 27 21:15:49 2024
+---------------------------------------------------------------------------------------+
| NVIDIA-SMI 546.30                 Driver Version: 546.30       CUDA Version: 12.3     |
|-----------------------------------------+----------------------+----------------------+
| GPU  Name                     TCC/WDDM  | Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp   Perf          Pwr:Usage/Cap |         Memory-Usage | GPU-Util  Compute M. |
|                                         |                      |               MIG M. |
|=========================================+======================+======================|
|   0  NVIDIA GeForce RTX 4060 ...  WDDM  | 00000000:01:00.0 Off |                  N/A |
| N/A   38C    P8               1W /  80W |      0MiB /  8188MiB |      0%      Default |
|                                         |                      |                  N/A |
+-----------------------------------------+----------------------+----------------------+

+---------------------------------------------------------------------------------------+
| Processes:                                                                            |
|  GPU   GI   CI        PID   Type   Process name                            GPU Memory |
|        ID   ID                                                             Usage      |
|=======================================================================================|
|    0   N/A  N/A     14808    C+G   ...8bbwe\SnippingTool\SnippingTool.exe    N/A      |
+---------------------------------------------------------------------------------------+

Checking the CUDA version

C:\stable-diffusion>nvcc -V
nvcc: NVIDIA (R) Cuda compiler driver
Copyright (c) 2005-2023 NVIDIA Corporation
Built on Wed_Feb__8_05:53:42_Coordinated_Universal_Time_2023
Cuda compilation tools, release 12.1, V12.1.66
Build cuda_12.1.r12.1/compiler.32415258_0

Installing PyTorch

C:\stable-diffusion>pip install torch torchvision torchaudio --extra-index-url https://download.pytorch.org/whl/cu121
Looking in indexes: https://pypi.org/simple, https://download.pytorch.org/whl/cu121
Collecting torch
  Downloading https://download.pytorch.org/whl/cu121/torch-2.4.0%2Bcu121-cp311-cp311-win_amd64.whl (2442.0 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 2.4/2.4 GB 2.2 MB/s eta 0:00:00
Collecting torchvision
  Downloading https://download.pytorch.org/whl/cu121/torchvision-0.19.0%2Bcu121-cp311-cp311-win_amd64.whl (5.8 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 5.8/5.8 MB 26.5 MB/s eta 0:00:00
Collecting torchaudio
  Downloading https://download.pytorch.org/whl/cu121/torchaudio-2.4.0%2Bcu121-cp311-cp311-win_amd64.whl (4.1 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 4.1/4.1 MB 32.8 MB/s eta 0:00:00
Collecting filelock
  Downloading filelock-3.15.4-py3-none-any.whl (16 kB)
Collecting typing-extensions>=4.8.0
  Downloading typing_extensions-4.12.2-py3-none-any.whl (37 kB)
Collecting sympy
  Downloading sympy-1.13.1-py3-none-any.whl (6.2 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 6.2/6.2 MB 13.2 MB/s eta 0:00:00
Collecting networkx
  Downloading networkx-3.3-py3-none-any.whl (1.7 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.7/1.7 MB 21.4 MB/s eta 0:00:00
Collecting jinja2
  Downloading jinja2-3.1.4-py3-none-any.whl (133 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 133.3/133.3 kB 7.7 MB/s eta 0:00:00
Collecting fsspec
  Downloading fsspec-2024.6.1-py3-none-any.whl (177 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 177.6/177.6 kB ? eta 0:00:00
Collecting numpy
  Downloading numpy-2.0.1-cp311-cp311-win_amd64.whl (16.6 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 16.6/16.6 MB 17.7 MB/s eta 0:00:00
Collecting pillow!=8.3.*,>=5.3.0
  Downloading pillow-10.4.0-cp311-cp311-win_amd64.whl (2.6 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 2.6/2.6 MB 18.0 MB/s eta 0:00:00
Collecting MarkupSafe>=2.0
  Downloading https://download.pytorch.org/whl/MarkupSafe-2.1.5-cp311-cp311-win_amd64.whl (17 kB)
Collecting mpmath<1.4,>=1.1.0
  Downloading https://download.pytorch.org/whl/mpmath-1.3.0-py3-none-any.whl (536 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 536.2/536.2 kB 16.4 MB/s eta 0:00:00
Installing collected packages: mpmath, typing-extensions, sympy, pillow, numpy, networkx, MarkupSafe, fsspec, filelock, jinja2, torch, torchvision, torchaudio
Successfully installed MarkupSafe-2.1.5 filelock-3.15.4 fsspec-2024.6.1 jinja2-3.1.4 mpmath-1.3.0 networkx-3.3 numpy-2.0.1 pillow-10.4.0 sympy-1.13.1 torch-2.4.0+cu121 torchaudio-2.4.0+cu121 torchvision-0.19.0+cu121 typing-extensions-4.12.2
WARNING: There was an error checking the latest version of pip.

Checking the installed PyTorch version

C:\stable-diffusion>pip show torch
Name: torch
Version: 2.4.0+cu121
Summary: Tensors and Dynamic neural networks in Python with strong GPU acceleration
Home-page: https://pytorch.org/
Author: PyTorch Team
Author-email: packages@pytorch.org
License: BSD-3
Location: C:\Server\python-3.11.9-cuda121\Lib\site-packages
Requires: filelock, fsspec, jinja2, networkx, sympy, typing-extensions
Required-by: torchaudio, torchvision
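As a final check that the CUDA build is actually usable from this interpreter (a quick sanity test; it should print True and the GPU name reported by nvidia-smi above):

C:\stable-diffusion>python -c "import torch; print(torch.cuda.is_available(), torch.cuda.get_device_name(0))"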