Ollama not using AMD GPU on Arch Linux [Fixed]
from autonomoususer@lemmy.world to selfhosted@lemmy.world on 20 Mar 13:21
https://lemmy.world/post/27088699

cross-posted from: lemmy.world/post/27088416

This is an update to a previous post found at lemmy.world/post/27013201


Ollama uses the AMD ROCm library, which works well with many AMD GPUs that are not listed as compatible, by forcing a specific LLVM target.
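For example, you can check which LLVM target (gfx version) ROCm detects for your card, assuming ROCm's rocminfo tool is installed:

rocminfo | grep gfx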

The original Ollama documentation is wrong, as the following cannot be set for individual GPUs, only for all or none, as shown at github.com/ollama/ollama/issues/8473

AMD GPU issue fix

  1. Check that your GPU is not already listed as compatible at github.com/ollama/ollama/blob/main/docs/gpu.md#linux-support (a command to identify your card is shown below).
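One way to identify your card, assuming the pciutils package is installed:
lspci | grep -i vga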
  2. Edit the Ollama service file. This uses the text editor set in the $SYSTEMD_EDITOR environment variable.
sudo systemctl edit ollama.service
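If $SYSTEMD_EDITOR is unset, systemctl edit falls back to $EDITOR and then $VISUAL. Depending on your sudo configuration, you may be able to set the editor just for this command, for example:
sudo SYSTEMD_EDITOR=nano systemctl edit ollama.service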
  3. Add the following, then save and exit. The value 10.3.0 corresponds to the gfx1030 LLVM target; you can try different versions as shown at github.com/ollama/ollama/blob/main/docs/gpu.md#overrides-on-linux
[Service]
Environment="HSA_OVERRIDE_GFX_VERSION=10.3.0"
  4. Restart the Ollama service.
sudo systemctl restart ollama
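To confirm the override took effect, check the service logs and watch which device Ollama reports at startup (the exact log wording varies between versions):

sudo journalctl -u ollama -b | grep -i -e gpu -e rocm

Once a model is loaded, ollama ps should also report the model running on GPU rather than CPU.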

#selfhosted


possiblylinux127@lemmy.zip on 20 Mar 20:42

I would run it in a Podman container with the GPU passed through.
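A minimal sketch of that, assuming the official ollama/ollama:rocm image; the HSA override from the post can be passed as an environment variable:

podman run -d --device /dev/kfd --device /dev/dri \
  -e HSA_OVERRIDE_GFX_VERSION=10.3.0 \
  -v ollama:/root/.ollama -p 11434:11434 \
  --name ollama docker.io/ollama/ollama:rocm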

30p87@feddit.org on 20 Mar 22:33

Why not throw that into a VM with VFIO passthrough, plug the GPU in via an external dock, and, since we are already abstracting shit away for unnecessary complexity and incompatibility, do all that on Windows?

exu@feditown.com on 20 Mar 22:58

Nested VMs stay performant about three levels deep, so do that as well.

possiblylinux127@lemmy.zip on 21 Mar 00:18

Because that is way more complicated?

It is really easy to run Ollama in a container.

30p87@feddit.org on 21 Mar 06:22

Really easy to start running it.

Then everything goes wrong, from configuration to logs to CUDA. And the worst fucking debugging ever.

possiblylinux127@lemmy.zip on 21 Mar 13:02

On Linux you can download Alpaca. I think it is CPU-only, but it is simpler.

30p87@feddit.org on 21 Mar 13:14

Ollama is simple too; I meant that containers make everything a nightmare to maintain.

possiblylinux127@lemmy.zip on 21 Mar 16:39

If containers are the hard part, you are doing it wrong. Containers should make it much easier.