Replies: 3 comments
- Are you using a Podman or Docker machine, or are you trying to run it locally?
- @slp Do you think there could be an issue with Vulkan?
- Issue was resolved. It's possible to set Vulkan's memory limit; it was reporting half of the available memory (which must be the default). After updating it, the model loaded.
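As a minimal sketch of the fix described above: the thread doesn't name the exact setting, but llama.cpp's Vulkan backend reads an environment variable that caps the maximum allocation size, which can be raised before launching the model. The variable name and the 48 GiB value below are assumptions, not confirmed by this discussion; adjust for your hardware.

```shell
# Assumption: the Vulkan backend honors GGML_VK_FORCE_MAX_ALLOCATION_SIZE
# (a llama.cpp env var); the thread only says "set the memory limits of Vulkan".
# Raise the allocation ceiling to ~48 GiB on a 64 GB machine, then launch as usual.
export GGML_VK_FORCE_MAX_ALLOCATION_SIZE=$((48 * 1024 * 1024 * 1024))
echo "$GGML_VK_FORCE_MAX_ALLOCATION_SIZE"
```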
- I'm currently getting an error loading models with large context sizes, even though I'm running a Mac Studio with 64 GB of memory. I've read about an inherent limitation in Vulkan; is this true? I'm able to run Gemma 12B, but anything larger fails to load.