OutOfMemoryError when running gpt2 on T4 GPU #287
Unanswered · zhaoyang-star asked this question in Q&A · 0 replies
I tried running `benchmark_throughput.py` using the hf backend on a T4. The T4 has 16 GB of global memory and gpt2 has only 124M parameters, so there should be plenty of memory for the model. But the following error occurred. Why did it try to allocate 8.46 GiB?
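The weights themselves are indeed tiny (124M parameters is roughly 0.25 GB in fp16), but activations grow with batch size and sequence length. In particular, a single logits tensor of shape `[batch, seq_len, vocab]` can dwarf the weights, since gpt2's vocabulary has 50257 tokens. A minimal sketch of the arithmetic (the batch and sequence values below are illustrative assumptions, not the benchmark's actual settings):

```python
# Rough memory estimate for a decoder's output logits tensor.
# Hypothetical sketch: the real allocation in benchmark_throughput.py
# depends on its batch size and prompt lengths, which are not shown here.

def logits_bytes(batch: int, seq_len: int, vocab: int, bytes_per_elem: int) -> int:
    """Bytes needed for a [batch, seq_len, vocab] logits tensor."""
    return batch * seq_len * vocab * bytes_per_elem

# gpt2's vocabulary is 50257 tokens; assume fp16 (2 bytes per element).
VOCAB = 50257
gib = logits_bytes(batch=32, seq_len=1024, vocab=VOCAB, bytes_per_elem=2) / 2**30
print(f"logits for batch=32, seq=1024: {gib:.2f} GiB")
```

So even a modest batch of full-length gpt2 sequences produces multi-GiB intermediate tensors; reducing the batch size or the number of concurrent prompts passed to the benchmark is the usual workaround on a 16 GB card.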