CUDA error with AMD GPU? #7867
mikeperalta1
started this conversation in General
1 comment · 3 replies
-
Same here, commenting to bump. I'm trying to run it on an AMD RX 7700 XT. Logs below.
-
Hey all,
Trying to figure out what I'm doing wrong here. Running llama.cpp with an AMD GPU (RX 6600 XT) spits out a confusing CUDA error, which makes no sense since I don't have an NVIDIA GPU:
It does detect my GPU:
Currently building with
make LLAMA_HIPBLAS=1 LLAMA_HIP_UMA=1 AMDGPU_TARGETS=gfx1032 -j8
but I've tried other variants to no avail. Here's my build script:

Clearly I'm just brute-forcing this and don't know how it's supposed to work. Any advice?
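For reference, below is a minimal sketch of how a hipBLAS build and run is commonly set up for a gfx1032 card such as the RX 6600 XT. The HSA_OVERRIDE_GFX_VERSION workaround, the model path, and the binary name are assumptions rather than details taken from this thread:

```sh
# Sketch only: build llama.cpp with hipBLAS for gfx1032 (RX 6600 XT) and run
# with the usual gfx-version override. Adjust paths for your setup.

make clean
# LLAMA_HIP_UMA=1 targets APUs with unified memory; on a discrete card it is
# unnecessary and can hurt performance, so it is omitted here.
make LLAMA_HIPBLAS=1 AMDGPU_TARGETS=gfx1032 -j8

# gfx1032 is not on ROCm's officially supported list, so the runtime may
# reject the device; spoofing the nearest supported RDNA2 target is a common
# workaround (assumed to apply to this setup).
export HSA_OVERRIDE_GFX_VERSION=10.3.0

# Offload all layers to the GPU; the model path is a placeholder, and the
# binary may be named llama-cli in newer builds.
./main -m ./models/your-model.gguf -ngl 99 -p "Hello"
```

The idea behind the override is that ROCm's libraries ship code objects only for a short list of officially supported gfx targets, so consumer cards often have to report themselves as the nearest supported target at run time.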