
Linux Debian (Ubuntu) Guide for ROCm


Update: for people waiting on Windows support, it is unlikely older cards will be supported, and the chances for the rest of the cards on the Windows support list are slim, because ROCm support for them will probably be dropped in 2-3 years when the 8000 series releases.

Start by following the official ROCm install guide for Ubuntu:

https://rocm.docs.amd.com/projects/install-on-linux/en/latest/how-to/native-install/ubuntu.html

sudo apt update

sudo apt install git python3-pip python3 python-is-python3 python3.11-venv libstdc++-12-dev wget amdgpu build-essential google-perftools && sudo apt install amdgpu-dkms rocm-hip-libraries

(Note: `base-devel` is an Arch Linux package name; the Ubuntu equivalent metapackage is `build-essential`.)

Now add your user to the render group (replace username with your own).

sudo adduser username render

sudo adduser username video

Edit ~/.profile with your editor of choice.

nano ~/.profile

Paste the following line at the bottom of the file, then press Ctrl+X, press Y to confirm, and press Enter to save before exiting.

export HSA_OVERRIDE_GFX_VERSION=10.3.0

(change the version number to 11.0.0 for the 7000 series)
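The override values above can be summarized in a small lookup. This is a minimal sketch: only the two values mentioned in this guide are from the guide itself, and the helper function is purely illustrative.

```python
# Illustrative lookup: which HSA_OVERRIDE_GFX_VERSION this guide suggests
# per Radeon RX card family. Only the two values are from the guide;
# the helper itself is a hypothetical convenience.
OVERRIDES = {
    "6000": "10.3.0",  # RDNA2 (gfx103x) cards, per the guide
    "7000": "11.0.0",  # RDNA3 (gfx110x) cards, per the guide
}

def override_for(series: str) -> str:
    """Return the override string for a Radeon RX series, e.g. '6000'."""
    return OVERRIDES[series]

print(override_for("6000"))  # -> 10.3.0
```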

Now make sure to restart your computer before continuing.

after reboot:

git clone https://github.com/AUTOMATIC1111/stable-diffusion-webui

cd stable-diffusion-webui

 

python3 -m venv venv

Creating the venv beforehand skips some pain of having to reinstall PyTorch later.

now activate the venv using

. venv/bin/activate

You will now see "(venv)" at the front of your console prompt.

Run these two commands separately so things install in the necessary order for pip to build the wheels (.whl files):

pip3 install wheel

THEN

pip3 install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/rocm5.7

Let's verify PyTorch was installed correctly with GPU support. First, enter the Python console.

python3

Now enter the following two lines of code. If it returns True then everything was installed correctly.

import torch

torch.cuda.is_available()

Then enter exit() to exit the Python console.
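The two-line check above can also be wrapped in a small script that reports what it finds. This is a minimal sketch: the function name and message strings are my own, and note that ROCm builds of PyTorch expose the GPU through the `torch.cuda` namespace.

```python
# Illustrative GPU check: reports whether PyTorch can see the GPU via ROCm.
# Falls back gracefully if torch is not installed (e.g. venv not activated).
def gpu_report() -> str:
    try:
        import torch
    except ImportError:
        return "torch is not installed -- activate the venv and rerun the pip install"
    # ROCm builds of PyTorch reuse the torch.cuda namespace for HIP devices
    if not torch.cuda.is_available():
        return "no GPU visible -- check HSA_OVERRIDE_GFX_VERSION and group membership"
    return f"OK: {torch.cuda.get_device_name(0)} (torch {torch.__version__})"

print(gpu_report())
```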

Finally, if you are an update maniac like me, you CAN update to the latest version and it will still work; simply refer to step 1. (Also: if you close the terminal or lose your connection during any of the downloads, you risk leaving behind broken fragments because some files won't finish downloading, and that will result in an error.)

Now, from step one, open https://repo.radeon.com/amdgpu-install/ and select latest, then ubuntu (Debian's most popular variant, and the default name for .deb files), and finally jammy. Download the .deb file there, then run the same command you used to install 5.4.2 to install 5.6, changing the .deb name. You will see the same _apt error, but again, this is just for changing versions. Note that we cannot update the ROCm PyTorch build itself, as it is currently only supported under 5.4.2; however, it still works with 5.6 as long as you have successfully installed the 5.4.2 ROCm PyTorch first.

When installing 5.6, it should ask to update the repositories; select yes. Afterwards, go to your software sources and find the one repository entry that says @AMDGPUDRIVER@ where the rest say 5.6, and change @AMDGPUDRIVER@ to the word latest. The entry we change should end up looking something like: deb https://repo.radeon.com/amdgpu/latest/ubuntu jammy proprietary (with latest where @AMDGPUDRIVER@ used to be). Enable all the entries, update the cache with sudo apt update, then open your software updater and you can update everything.
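The placeholder edit described above can also be done with sed instead of a text editor. This is a minimal sketch run on an example line rather than the real file, since the actual file name under /etc/apt/sources.list.d/ varies by install -- check your own sources for the entry containing @AMDGPUDRIVER@.

```shell
# Illustrative: swap the @AMDGPUDRIVER@ placeholder for "latest" in an
# AMD apt source line. On a real system you would run the sed against the
# file in /etc/apt/sources.list.d/ that contains this entry (with sudo).
line='deb https://repo.radeon.com/amdgpu/@AMDGPUDRIVER@/ubuntu jammy proprietary'
fixed=$(printf '%s\n' "$line" | sed 's|/@AMDGPUDRIVER@/|/latest/|')
printf '%s\n' "$fixed"
# -> deb https://repo.radeon.com/amdgpu/latest/ubuntu jammy proprietary
```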

sudo reboot

Also, if you do that, you can install the latest ROCm PyTorch module in Stable Diffusion (confirmed working with the Stable Diffusion build used in this guide; make sure to back up the venv folder in case you make a mistake).

First back up the venv folder into a separate directory, then do this:

python -m venv venv

pip3 install --pre torch torchvision torchaudio --index-url https://download.pytorch.org/whl/nightly/rocm6.0
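The backup step above can be sketched as follows. Run it from the stable-diffusion-webui directory; the name venv.bak is just an illustrative choice for the "separate directory".

```shell
# Illustrative: snapshot the working venv before the nightly upgrade so you
# can roll back. "venv.bak" is an arbitrary example name.
if [ -d venv ]; then
    cp -r venv venv.bak
fi
# If the nightly install breaks things, restore the snapshot:
#   rm -rf venv && mv venv.bak venv
```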

After that finishes, run webui.sh. Remember the first generation takes a minute.

If you would like to help bring xformers to AMD, please refer to this issue and participate in testing; make sure to include your specs: https://github.com/facebookresearch/xformers/pull/978

If you need further assistance, feel free to ask me (@hina) in my Discord: https://discord.com/invite/cYA4XT6vGH
