Sources
- 14.3. pip - Ubuntu and WSL
- How to start, stop, and reboot WSL Linux :: LainyZine tech blog
- Method 1: Installing the CUDA Toolkit directly in a WSL2 Ubuntu environment - WSL2: a hands-on guide to server-client setups with Windows, Ubuntu, Docker, TensorFlow2, and Remote Desktop
- How to Install Python 3.11 on Ubuntu 22.04
- Setting up a CUDA environment with WSL2 (including CUDA + cuDNN installation) :: cjkangme.log
WSL machine status
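The command itself was not preserved in this log; listing WSL machines and their state is normally done from PowerShell or cmd with:

```shell
wsl.exe --list --verbose
```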
NAME STATE VERSION
* docker-desktop Stopped 2
Ubuntu Running 2
Shut down the WSL machines
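The shutdown command was dropped from the log; stopping every running distribution (and the WSL2 VM itself) is done with:

```shell
wsl.exe --shutdown
```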
The operation completed successfully.
Reinstall the WSL machine
Remove the existing machine
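The unregister command is missing here; given the machine list above, it was presumably:

```shell
wsl.exe --unregister Ubuntu
```

Note that `--unregister` permanently deletes that distribution's filesystem.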
Unregistering.
The operation completed successfully.
NAME STATE VERSION
* docker-desktop Stopped 2
Check available WSL distributions
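The listing command was not captured; the online distribution catalog shown below is printed by:

```shell
wsl.exe --list --online
```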
The following is a list of valid distributions that can be installed.
Install using 'wsl.exe --install <Distro>'.
NAME FRIENDLY NAME
Ubuntu Ubuntu
Debian Debian GNU/Linux
kali-linux Kali Linux Rolling
Ubuntu-18.04 Ubuntu 18.04 LTS
Ubuntu-20.04 Ubuntu 20.04 LTS
Ubuntu-22.04 Ubuntu 22.04 LTS
Ubuntu-24.04 Ubuntu 24.04 LTS
OracleLinux_7_9 Oracle Linux 7.9
OracleLinux_8_7 Oracle Linux 8.7
OracleLinux_9_1 Oracle Linux 9.1
openSUSE-Leap-15.6 openSUSE Leap 15.6
SUSE-Linux-Enterprise-15-SP5 SUSE Linux Enterprise 15 SP5
SUSE-Linux-Enterprise-15-SP6 SUSE Linux Enterprise 15 SP6
openSUSE-Tumbleweed openSUSE Tumbleweed
Install WSL Linux
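The install command is missing from the log; matching the output that follows, it was presumably:

```shell
wsl.exe --install Ubuntu-22.04
```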
Installing: Ubuntu 22.04 LTS
Ubuntu 22.04 LTS has been installed.
Launching Ubuntu 22.04 LTS...
Installing, this may take a few minutes...
Please create a default UNIX user account. The username does not need to match your Windows username.
For more information visit: https://aka.ms/wslusers
Enter new UNIX username: bluesanta
New password:
Retype new password:
passwd: password updated successfully
Installation successful!
To run a command as administrator (user "root"), use "sudo <command>".
See "man sudo_root" for details.
Welcome to Ubuntu 22.04.5 LTS (GNU/Linux 5.15.167.4-microsoft-standard-WSL2 x86_64)
* Documentation: https://help.ubuntu.com
* Management: https://landscape.canonical.com
* Support: https://ubuntu.com/pro
System information as of Sun Feb 16 15:54:18 KST 2025
System load: 0.0 Processes: 81
Usage of /: 0.1% of 1006.85GB Users logged in: 0
Memory usage: 6% IPv4 address for eth0: 172.27.90.206
Swap usage: 0%
This message is shown once a day. To disable it please create the
/home/bluesanta/.hushlogin file.
Verify the installation
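As before, the state listing shown below comes from:

```shell
wsl.exe --list --verbose
```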
NAME STATE VERSION
* docker-desktop Running 2
Ubuntu-22.04 Running 2
Install Python 3.11
Check the currently installed Python version
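The version check producing the output below is typically:

```shell
python3 -V
```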
Python 3.10.12
Remove the previously installed Python
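The exact removal command was not preserved; one possible form is shown below. Be aware that purging the distribution's default Python can break apt and other system tools, so building 3.11 alongside 3.10 (with `make altinstall`, as done later in this guide) is usually the safer route:

```shell
# hypothetical reconstruction; the original command is unknown
sudo apt-get remove -y python3
```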
Install the packages required to build Python 3.11
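The original dependency list was not recorded; a typical set for building CPython from source on Ubuntu is:

```shell
sudo apt update
sudo apt install -y build-essential wget curl libssl-dev zlib1g-dev \
  libbz2-dev libreadline-dev libsqlite3-dev libffi-dev liblzma-dev \
  libncursesw5-dev libgdbm-dev
```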
Install tcl8.6
Install tk8.6
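The Tcl/Tk development headers are needed so the tkinter module gets built; a likely command for both steps:

```shell
sudo apt install -y tcl8.6-dev tk8.6-dev
```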
Download the Python 3.11 source
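Given the version verified later in this guide (3.11.11), the download was presumably:

```shell
wget https://www.python.org/ftp/python/3.11.11/Python-3.11.11.tgz
```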
Extract the Python 3.11 source
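Assuming the tarball name above:

```shell
tar -xzf Python-3.11.11.tgz
cd Python-3.11.11
```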
Run configure
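The configure flags used were not recorded; a common invocation for an optimized build is:

```shell
./configure --enable-optimizations
```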
Build Python 3.11
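A typical build-and-install sequence (the original commands were not preserved):

```shell
make -j "$(nproc)"
sudo make altinstall   # altinstall avoids overwriting the system python3
```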
Check tkinter
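A quick way to confirm tkinter was built in:

```shell
python3.11 -c "import tkinter; print(tkinter.TkVersion)"
```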
Verify the Python 3.11 installation
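The version output shown below comes from:

```shell
python3.11 -V
```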
Python 3.11.11
Install pip
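Judging from the output below (pip and wheel upgraded into the user site-packages, then a version check), the commands were roughly:

```shell
python3.11 -m pip install --upgrade pip wheel
pip3 -V
```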
Defaulting to user installation because normal site-packages is not writeable
Collecting pip
Downloading pip-25.0.1-py3-none-any.whl.metadata (3.7 kB)
Collecting wheel
Downloading wheel-0.45.1-py3-none-any.whl.metadata (2.3 kB)
Downloading pip-25.0.1-py3-none-any.whl (1.8 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.8/1.8 MB 708.7 kB/s eta 0:00:00
Downloading wheel-0.45.1-py3-none-any.whl (72 kB)
Installing collected packages: wheel, pip
WARNING: The script wheel is installed in '/home/bluesanta/.local/bin' which is not on PATH.
Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location.
WARNING: The scripts pip, pip3 and pip3.11 are installed in '/home/bluesanta/.local/bin' which is not on PATH.
Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location.
Successfully installed pip-25.0.1 wheel-0.45.1
[notice] A new release of pip is available: 24.0 -> 25.0.1
[notice] To update, run: pip3 install --upgrade pip
pip 25.0.1 from /home/bluesanta/.local/lib/python3.11/site-packages/pip (python 3.11)
Install CUDA Toolkit & cuDNN
Install CUDA Toolkit 12.1
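The installation commands were not captured here. NVIDIA's published WSL-Ubuntu instructions for CUDA 12.1 follow this pattern (verify the exact URLs against NVIDIA's download page before running):

```shell
wget https://developer.download.nvidia.com/compute/cuda/repos/wsl-ubuntu/x86_64/cuda-wsl-ubuntu.pin
sudo mv cuda-wsl-ubuntu.pin /etc/apt/preferences.d/cuda-repository-pin-600
wget https://developer.download.nvidia.com/compute/cuda/12.1.0/local_installers/cuda-repo-wsl-ubuntu-12-1-local_12.1.0-1_amd64.deb
sudo dpkg -i cuda-repo-wsl-ubuntu-12-1-local_12.1.0-1_amd64.deb
sudo cp /var/cuda-repo-wsl-ubuntu-12-1-local/cuda-*-keyring.gpg /usr/share/keyrings/
sudo apt-get update
sudo apt-get -y install cuda
ls /usr/local   # the two entries below come from a listing like this
```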
cuda
cuda-12.1
Install CUDA Toolkit 11.8
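The 11.8 commands were likewise dropped; NVIDIA's WSL-Ubuntu local-installer pattern for 11.8 looks like this (the exact filename should be confirmed on NVIDIA's download page):

```shell
wget https://developer.download.nvidia.com/compute/cuda/11.8.0/local_installers/cuda-repo-wsl-ubuntu-11-8-local_11.8.0-1_amd64.deb
sudo dpkg -i cuda-repo-wsl-ubuntu-11-8-local_11.8.0-1_amd64.deb
sudo cp /var/cuda-repo-wsl-ubuntu-11-8-local/cuda-*-keyring.gpg /usr/share/keyrings/
sudo apt-get update
sudo apt-get -y install cuda
```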
cuda
cuda-11
cuda-11.8
Install cuDNN 9.8
Install cuDNN 8.6
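The cuDNN commands were also lost. Assuming NVIDIA's apt repository is already configured (it is after the CUDA steps above), the packages are typically named as follows, though names vary by repository and version, so check `apt-cache search cudnn` first:

```shell
# cuDNN 9.x built against CUDA 12
sudo apt-get install -y cudnn9-cuda-12
# cuDNN 8.x ships as libcudnn8 packages instead
sudo apt-get install -y libcudnn8 libcudnn8-dev
```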
Add the following to ~/.bashrc (CUDA 12.1 paths)
export CUDA_PATH=/usr/local/cuda-12.1
export CUDA_HOME=/usr/local/cuda-12.1
export CUDADIR=/usr/local/cuda-12.1
export PATH=$PATH:$CUDADIR/bin
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$CUDADIR/lib64
# export CUDNN_PATH=$(dirname $(python -c "import nvidia.cudnn;print(nvidia.cudnn.__file__)"))
# export CUDNN_PATH=/mnt/c/Llama/.venv_wsl/lib/python3.11/site-packages/nvidia/cudnn
# export LD_LIBRARY_PATH=$CUDNN_PATH/lib:$CONDA_PREFIX/lib/:$LD_LIBRARY_PATH
# export HF_HOME=/mnt/c/Llama/huggingface
Add the following to ~/.bashrc (CUDA 11 paths)
export CUDA_PATH=/usr/local/cuda-11
export CUDA_HOME=/usr/local/cuda-11
export CUDADIR=/usr/local/cuda-11
export PATH=$PATH:$CUDADIR/bin
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$CUDADIR/lib64
# export CUDNN_PATH=$(dirname $(python -c "import nvidia.cudnn;print(nvidia.cudnn.__file__)"))
# export CUDNN_PATH=/mnt/c/Llama/.venv_wsl/lib/python3.11/site-packages/nvidia/cudnn
# export LD_LIBRARY_PATH=/usr/local/lib:$CUDNN_PATH/lib:$CONDA_PREFIX/lib/:$LD_LIBRARY_PATH
export HF_HOME=/mnt/c/Llama/huggingface
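Either variant takes effect in new shells; to apply it to the current session:

```shell
source ~/.bashrc
```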
Verify the CUDA Toolkit installation
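The compiler banner below is printed by:

```shell
nvcc --version
```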
nvcc: NVIDIA (R) Cuda compiler driver
Copyright (c) 2005-2023 NVIDIA Corporation
Built on Tue_Feb__7_19:32:13_PST_2023
Cuda compilation tools, release 12.1, V12.1.66
Build cuda_12.1.r12.1/compiler.32415258_0
Install TensorFlow with GPU support
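The pip command was not preserved; current TensorFlow releases can pull the matching CUDA libraries via an extra, and a quick device check confirms GPU visibility:

```shell
pip install "tensorflow[and-cuda]"
python3 -c "import tensorflow as tf; print(tf.config.list_physical_devices('GPU'))"
```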
Install faiss-gpu-cu12
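Presumably a plain pip install matching the heading:

```shell
pip install faiss-gpu-cu12
```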
Install ollama
Download and run the ollama install script
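The curl progress output below matches ollama's documented one-line installer:

```shell
curl -fsSL https://ollama.com/install.sh | sh
```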
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
100 13269 0 13269 0 0 34377 0 --:--:-- --:--:-- --:--:-- 34375
>>> Installing ollama to /usr/local
>>> Downloading Linux amd64 bundle
####################################################################### 100.0%
>>> Creating ollama user...
>>> Adding ollama user to render group...
>>> Adding ollama user to video group...
>>> Adding current user to ollama group...
>>> Creating ollama systemd service...
>>> Enabling and starting ollama service...
Created symlink /etc/systemd/system/default.target.wants/ollama.service → /etc/systemd/system/ollama.service.
>>> Nvidia GPU detected.
>>> The Ollama API is now available at 127.0.0.1:11434.
>>> Install complete. Run "ollama" from the command line.
Check that the ollama service is running
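The service status below comes from systemd:

```shell
systemctl status ollama
```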
● ollama.service - Ollama Service
Loaded: loaded (/etc/systemd/system/ollama.service; enabled; vendor preset: enabled)
Active: active (running) since Sun 2025-02-16 20:56:53 KST; 17s ago
Main PID: 1766 (ollama)
Tasks: 11 (limit: 9345)
Memory: 38.8M
CGroup: /system.slice/ollama.service
└─1766 /usr/local/bin/ollama serve
Feb 16 20:56:53 DESKTOP-2NKL9OB systemd[1]: Started Ollama Service.
Feb 16 20:56:53 DESKTOP-2NKL9OB ollama[1766]: Couldn't find '/usr/share/ollama/.ollama/id_ed25519'. Generating new priv>
Change the ollama model path
Add the following to ollama.service
Environment="OLLAMA_MODELS=/mnt/c/Llama/Models/Ollama/models"
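How the file was edited was not recorded; one common approach is a systemd drop-in, with the setting placed under the [Service] section:

```shell
sudo systemctl edit ollama.service
# in the editor, add:
#   [Service]
#   Environment="OLLAMA_MODELS=/mnt/c/Llama/Models/Ollama/models"
```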
Restart the ollama service
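Reloading the unit files and restarting picks up the new environment variable:

```shell
sudo systemctl daemon-reload
sudo systemctl restart ollama
```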
Create a Python virtual environment
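Matching the activated prompt shown below, the environment was presumably created with the freshly built Python 3.11:

```shell
cd /mnt/c/Llama
python3.11 -m venv .venv_wsl
source .venv_wsl/bin/activate
```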
(.venv_wsl) bluesanta@DESKTOP-2NKL9OB:/mnt/c/Llama$
Install langserve_ollama-related packages
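The exact package list was not preserved; a typical LangServe + Ollama stack looks something like this (the package selection here is an assumption, not the author's list):

```shell
pip install langchain langchain-community "langserve[all]" fastapi uvicorn
```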
Install the huggingface-hub package
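Presumably a plain pip install:

```shell
pip install huggingface-hub
```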
Install an ollama model
(.venv_wsl) bluesanta@DESKTOP-2NKL9OB:/mnt/c/Llama$ cd Models/Ollama/gguf/
(.venv_wsl) bluesanta@DESKTOP-2NKL9OB:/mnt/c/Llama/Models/Ollama/gguf$ ollama create exaone -f EXAONE-3.5-7.8B-Instruct-BF16/Modelfile