
Installing Ollama on WSL2

파란크리스마스 2025. 2. 16. 14:51

Sources

WSL machine status

PS C:\Users\bluesanta> wsl -l -v
  NAME              STATE           VERSION
* docker-desktop    Stopped         2
  Ubuntu            Running         2

Shutting down the WSL machine

PS C:\Users\bluesanta> wsl -t Ubuntu
The operation completed successfully.

Reinstalling the WSL machine

Removing the existing machine

PS C:\Users\bluesanta> wsl --unregister Ubuntu
Unregistering.
The operation completed successfully.
PS C:\Users\bluesanta> wsl -l -v
  NAME              STATE           VERSION
* docker-desktop    Stopped         2

Listing available WSL distributions

PS C:\Users\bluesanta> wsl -l -o
The following is a list of valid distributions that can be installed.
Install using 'wsl.exe --install <Distro>'.

NAME                            FRIENDLY NAME
Ubuntu                          Ubuntu
Debian                          Debian GNU/Linux
kali-linux                      Kali Linux Rolling
Ubuntu-18.04                    Ubuntu 18.04 LTS
Ubuntu-20.04                    Ubuntu 20.04 LTS
Ubuntu-22.04                    Ubuntu 22.04 LTS
Ubuntu-24.04                    Ubuntu 24.04 LTS
OracleLinux_7_9                 Oracle Linux 7.9
OracleLinux_8_7                 Oracle Linux 8.7
OracleLinux_9_1                 Oracle Linux 9.1
openSUSE-Leap-15.6              openSUSE Leap 15.6
SUSE-Linux-Enterprise-15-SP5    SUSE Linux Enterprise 15 SP5
SUSE-Linux-Enterprise-15-SP6    SUSE Linux Enterprise 15 SP6
openSUSE-Tumbleweed             openSUSE Tumbleweed

Installing a WSL Linux distribution

PS C:\Users\bluesanta> wsl --install Ubuntu-22.04
Installing: Ubuntu 22.04 LTS
Ubuntu 22.04 LTS has been installed.
Launching Ubuntu 22.04 LTS...
Installing, this may take a few minutes...
Please create a default UNIX user account. The username does not need to match your Windows username.
For more information visit: https://aka.ms/wslusers
Enter new UNIX username: bluesanta
New password:
Retype new password:
passwd: password updated successfully
Installation successful!
To run a command as administrator (user "root"), use "sudo <command>".
See "man sudo_root" for details.
 
Welcome to Ubuntu 22.04.5 LTS (GNU/Linux 5.15.167.4-microsoft-standard-WSL2 x86_64)
 
 * Documentation:  https://help.ubuntu.com
 * Management:     https://landscape.canonical.com
 * Support:        https://ubuntu.com/pro
 
 System information as of Sun Feb 16 15:54:18 KST 2025
 
  System load:  0.0                 Processes:             81
  Usage of /:   0.1% of 1006.85GB   Users logged in:       0
  Memory usage: 6%                  IPv4 address for eth0: 172.27.90.206
  Swap usage:   0%
 
 
This message is shown once a day. To disable it please create the
/home/bluesanta/.hushlogin file.
bluesanta@DESKTOP-2NKL9OB:~$

Verifying the installation

PS C:\Users\bluesanta> wsl -l -v
  NAME              STATE           VERSION
* docker-desktop    Running         2
  Ubuntu-22.04      Running         2

Installing Python 3.11

Checking the currently installed Python version

bluesanta@DESKTOP-2NKL9OB:~$ python3 --version
Python 3.10.12

Removing the existing Python

bluesanta@DESKTOP-2NKL9OB:~$ sudo apt remove python3
bluesanta@DESKTOP-2NKL9OB:~$ sudo apt autoremove

Installing the packages required to build Python 3.11

bluesanta@DESKTOP-2NKL9OB:~$ sudo apt update
bluesanta@DESKTOP-2NKL9OB:~$ sudo apt install build-essential zlib1g-dev libncurses5-dev libgdbm-dev libnss3-dev libssl-dev libreadline-dev libffi-dev libsqlite3-dev wget libbz2-dev pkg-config lzma liblzma-dev

Downloading the Python 3.11 source

bluesanta@DESKTOP-2NKL9OB:~$ wget https://www.python.org/ftp/python/3.11.11/Python-3.11.11.tgz

Extracting the Python 3.11 source

bluesanta@DESKTOP-2NKL9OB:~$ tar -xf Python-3.11.11.tgz

Running configure

bluesanta@DESKTOP-2NKL9OB:~$ cd Python-3.11.11
bluesanta@DESKTOP-2NKL9OB:~/Python-3.11.11$ ./configure --enable-optimizations

Building Python 3.11

bluesanta@DESKTOP-2NKL9OB:~/Python-3.11.11$ grep -c processor /proc/cpuinfo
28
bluesanta@DESKTOP-2NKL9OB:~/Python-3.11.11$ make -j 28
bluesanta@DESKTOP-2NKL9OB:~/Python-3.11.11$ sudo make install
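The -j value above simply mirrors the processor count read from /proc/cpuinfo; nproc reports the same number, so an equivalent invocation is:

$ make -j "$(nproc)"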

Verifying the Python 3.11 installation

bluesanta@DESKTOP-2NKL9OB:/mnt/c/Users/bluesanta$ python3 --version
Python 3.11.11

Installing pip

bluesanta@DESKTOP-2NKL9OB:/mnt/c/Users/bluesanta$ wget https://bootstrap.pypa.io/get-pip.py
bluesanta@DESKTOP-2NKL9OB:/mnt/c/Users/bluesanta$ python3 get-pip.py
Defaulting to user installation because normal site-packages is not writeable
Collecting pip
  Downloading pip-25.0.1-py3-none-any.whl.metadata (3.7 kB)
Collecting wheel
  Downloading wheel-0.45.1-py3-none-any.whl.metadata (2.3 kB)
Downloading pip-25.0.1-py3-none-any.whl (1.8 MB)
   ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.8/1.8 MB 708.7 kB/s eta 0:00:00
Downloading wheel-0.45.1-py3-none-any.whl (72 kB)
Installing collected packages: wheel, pip
  WARNING: The script wheel is installed in '/home/bluesanta/.local/bin' which is not on PATH.
  Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location.
  WARNING: The scripts pip, pip3 and pip3.11 are installed in '/home/bluesanta/.local/bin' which is not on PATH.
  Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location.
Successfully installed pip-25.0.1 wheel-0.45.1
 
[notice] A new release of pip is available: 24.0 -> 25.0.1
[notice] To update, run: pip3 install --upgrade pip
bluesanta@DESKTOP-2NKL9OB:/mnt/c/Users/bluesanta$ pip3 --version
pip 25.0.1 from /home/bluesanta/.local/lib/python3.11/site-packages/pip (python 3.11)
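The install log warns that pip and wheel landed in /home/bluesanta/.local/bin, which is not on PATH. A minimal fix, assuming the default bash setup, is to prepend that directory in ~/.bashrc and reload it:

$ echo 'export PATH="$HOME/.local/bin:$PATH"' >> ~/.bashrc
$ source ~/.bashrc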

Installing the CUDA Toolkit & cuDNN

Installing the CUDA Toolkit

bluesanta@DESKTOP-2NKL9OB:~$ wget https://developer.download.nvidia.com/compute/cuda/12.1.0/local_installers/cuda_12.1.0_530.30.02_linux.run
bluesanta@DESKTOP-2NKL9OB:~$ chmod +x cuda_12.1.0_530.30.02_linux.run
bluesanta@DESKTOP-2NKL9OB:~$ sudo sh cuda_12.1.0_530.30.02_linux.run --toolkit --silent --driver
bluesanta@DESKTOP-2NKL9OB:~$ ls /usr/local/ | grep cuda
cuda
cuda-12.1

Installing cuDNN

bluesanta@DESKTOP-2NKL9OB:~$ wget https://developer.download.nvidia.com/compute/cudnn/9.8.0/local_installers/cudnn-local-repo-ubuntu2204-9.8.0_1.0-1_amd64.deb
bluesanta@DESKTOP-2NKL9OB:~$ sudo dpkg -i cudnn-local-repo-ubuntu2204-9.8.0_1.0-1_amd64.deb
bluesanta@DESKTOP-2NKL9OB:~$ sudo cp /var/cudnn-local-repo-ubuntu2204-9.8.0/cudnn-*-keyring.gpg /usr/share/keyrings/
bluesanta@DESKTOP-2NKL9OB:~$ sudo apt update
bluesanta@DESKTOP-2NKL9OB:~$ sudo apt-get -y install cudnn

Adding to ~/.bashrc

bluesanta@DESKTOP-2NKL9OB:~$ vi ~/.bashrc
export CUDA_PATH=/usr/local/cuda-12.1
export CUDA_HOME=/usr/local/cuda-12.1
export CUDADIR=/usr/local/cuda-12.1
export PATH=$PATH:$CUDADIR/bin
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$CUDADIR/lib64

# export CUDNN_PATH=$(dirname $(python -c "import nvidia.cudnn;print(nvidia.cudnn.__file__)"))
# export CUDNN_PATH=/mnt/c/Llama/.venv_wsl/lib/python3.11/site-packages/nvidia/cudnn
# export LD_LIBRARY_PATH=$CUDNN_PATH/lib:$CONDA_PREFIX/lib/:$LD_LIBRARY_PATH

# export HF_HOME=/mnt/c/Llama/huggingface
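These exports only take effect in new login shells; to pick them up in the current session, re-source the file and check one of the variables:

$ source ~/.bashrc
$ echo $CUDA_HOME
/usr/local/cuda-12.1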

Verifying the CUDA Toolkit installation

bluesanta@DESKTOP-2NKL9OB:/mnt/c/Users/bluesanta$ nvcc --version
nvcc: NVIDIA (R) Cuda compiler driver
Copyright (c) 2005-2023 NVIDIA Corporation
Built on Tue_Feb__7_19:32:13_PST_2023
Cuda compilation tools, release 12.1, V12.1.66
Build cuda_12.1.r12.1/compiler.32415258_0

Installing TensorFlow with GPU support

$ pip install tensorrt==8.6.1
$ pip install tensorflow[and-cuda]==2.15
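To confirm this TensorFlow build actually sees the GPU through WSL2, a quick check (run in the same environment the packages were installed into) is:

$ python3 -c "import tensorflow as tf; print(tf.config.list_physical_devices('GPU'))"

A non-empty list of PhysicalDevice entries means the CUDA/cuDNN setup is being picked up.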

Installing faiss-gpu-cu12

bluesanta@DESKTOP-2NKL9OB:~$ pip install faiss-gpu-cu12
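A similar one-liner confirms that the CUDA-enabled FAISS build loads and detects the GPU (a count of 1 or more is expected):

$ python3 -c "import faiss; print(faiss.get_num_gpus())"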

Installing Ollama

Downloading and running the Ollama install script

bluesanta@DESKTOP-2NKL9OB:~$ curl https://ollama.ai/install.sh | sh
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100 13269    0 13269    0     0  34377      0 --:--:-- --:--:-- --:--:-- 34375
>>> Installing ollama to /usr/local
>>> Downloading Linux amd64 bundle
######################################################################## 100.0%
>>> Creating ollama user...
>>> Adding ollama user to render group...
>>> Adding ollama user to video group...
>>> Adding current user to ollama group...
>>> Creating ollama systemd service...
>>> Enabling and starting ollama service...
Created symlink /etc/systemd/system/default.target.wants/ollama.service → /etc/systemd/system/ollama.service.
>>> Nvidia GPU detected.
>>> The Ollama API is now available at 127.0.0.1:11434.
>>> Install complete. Run "ollama" from the command line.

Verifying the Ollama service is running

bluesanta@DESKTOP-2NKL9OB:~$ sudo systemctl status ollama.service
● ollama.service - Ollama Service
     Loaded: loaded (/etc/systemd/system/ollama.service; enabled; vendor preset: enabled)
     Active: active (running) since Sun 2025-02-16 20:56:53 KST; 17s ago
   Main PID: 1766 (ollama)
      Tasks: 11 (limit: 9345)
     Memory: 38.8M
     CGroup: /system.slice/ollama.service
             └─1766 /usr/local/bin/ollama serve

Feb 16 20:56:53 DESKTOP-2NKL9OB systemd[1]: Started Ollama Service.
Feb 16 20:56:53 DESKTOP-2NKL9OB ollama[1766]: Couldn't find '/usr/share/ollama/.ollama/id_ed25519'. Generating new priv>
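With the service active, the HTTP API on 127.0.0.1:11434 can be probed directly; for example, the version endpoint should return a small JSON payload:

$ curl http://127.0.0.1:11434/api/version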

Changing the Ollama model path

Adding to ollama.service

bluesanta@DESKTOP-2NKL9OB:~$ sudo vi /etc/systemd/system/ollama.service
Environment="OLLAMA_MODELS=/mnt/c/Llama/Models/Ollama/models"

Restarting the Ollama service

bluesanta@DESKTOP-2NKL9OB:~$ sudo systemctl daemon-reload
bluesanta@DESKTOP-2NKL9OB:~$ sudo systemctl restart ollama.service
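To confirm the new model path was picked up after the reload, systemd can print the unit's environment:

$ systemctl show ollama.service -p Environment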

Creating a Python virtual environment

bluesanta@DESKTOP-2NKL9OB:/mnt/c/Llama$ python3 -m venv .venv_wsl
bluesanta@DESKTOP-2NKL9OB:/mnt/c/Llama$ source .venv_wsl/bin/activate
(.venv_wsl) bluesanta@DESKTOP-2NKL9OB:/mnt/c/Llama$

Installing packages for langserve_ollama

bluesanta@DESKTOP-2NKL9OB:/mnt/c/Llama$ pip install fastapi uvicorn langchain_core langserve langchain_ollama langchain_text_splitters langchain_community sse_starlette pdfplumber faiss-gpu-cu12
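A quick sanity check that everything installed into the virtual environment and imports cleanly (faiss is the module name provided by faiss-gpu-cu12):

(.venv_wsl) $ python3 -c "import fastapi, uvicorn, langserve, langchain_ollama, pdfplumber, faiss; print('ok')"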

Installing the huggingface-hub package

bluesanta@DESKTOP-2NKL9OB:/mnt/c/Llama$ pip install huggingface-hub

Installing the Ollama model

bluesanta@DESKTOP-2NKL9OB:/mnt/c/Llama$ huggingface-cli download LGAI-EXAONE/EXAONE-3.5-7.8B-Instruct-GGUF EXAONE-3.5-7.8B-Instruct-BF16.gguf --local-dir /mnt/c/Llama/Models/Ollama/gguf --local-dir-use-symlinks False
(.venv_wsl) bluesanta@DESKTOP-2NKL9OB:/mnt/c/Llama$ cd Models/Ollama/gguf/
(.venv_wsl) bluesanta@DESKTOP-2NKL9OB:/mnt/c/Llama/Models/Ollama/gguf$ ollama create exaone -f EXAONE-3.5-7.8B-Instruct-BF16/Modelfile
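The Modelfile itself is not shown in this post. As a hypothetical minimal example for a local GGUF file, it would at least point FROM at the downloaded weights (a TEMPLATE/SYSTEM block matching the model card's chat format would normally be added as well):

FROM /mnt/c/Llama/Models/Ollama/gguf/EXAONE-3.5-7.8B-Instruct-BF16.gguf
PARAMETER temperature 0.7

Once created, the model can be tried interactively with:

$ ollama run exaone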