Naive Question: Can we use CPU instead of GPU during inference? #1531
Closed
liuyifan22 started this conversation in General
- Hello! I am a newcomer to this field, and I am wondering whether a CPU can be used for inference. If so, I'd like to set something up on my own computer, which has no GPU, for daily use. Thanks!

Replies: 1 comment, 1 reply

- Check out the LLaVA llamafile: running it with -ngl 0 keeps inference entirely on the CPU.
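For reference, a minimal sketch of what that looks like on Linux or macOS, assuming one of the published LLaVA llamafile builds (the filename below is only an example; use whichever build the llamafile releases page currently offers):

```sh
# Make the self-contained llamafile executable (filename is an example).
chmod +x llava-v1.5-7b-q4.llamafile

# -ngl 0 offloads zero layers to the GPU, so the whole model runs on the CPU.
./llava-v1.5-7b-q4.llamafile -ngl 0
```

The same -ngl / --n-gpu-layers flag is also accepted by the plain llama.cpp binaries, so setting it to 0 there keeps inference CPU-only as well.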