Replies: 3 comments
-
It does not make much practical sense unless you have some very specific conditions or large batches, but you can try. Make sure to read this before proceeding:
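To illustrate the "large batches" point: chunk-wise inference is usually batched by stacking many audio chunks into one tensor and running a single forward pass. A minimal sketch with a hypothetical stand-in module (the real silero VAD jit model is not part of this thread):

```python
import torch

# Hypothetical stand-in for the VAD model: returns one speech
# probability per chunk in the batch.
class TinyVAD(torch.nn.Module):
    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return torch.sigmoid(x.mean(dim=-1))

model = torch.jit.script(TinyVAD())

# Stack 32 chunks of 1600 samples each into a (batch, samples) tensor
# and run them in one forward pass instead of a Python loop.
chunks = [torch.randn(1600) for _ in range(32)]
batch = torch.stack(chunks)          # shape (32, 1600)
with torch.no_grad():
    probs = model(batch)             # shape (32,)
```

Whether this actually beats sequential CPU inference depends on chunk size and batch size, which is why it only pays off in specific setups.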
-
GPU inference would definitely be helpful :) I have a use case where I need to process multiple hours of audio captured by a wearable device.
-
@mironnn
While running inference chunk-wise with this code:
On the 3rd iteration I get:
Can you please help me understand why this is happening and how to fix it? Versions:
-
❓ Questions and Help
Hi.
Thank you for the provided repo and models.
The wiki says that "Using batching or GPU can also improve performance considerably".
I've tried to run the VAD models on GPU and ran into errors.
Can I run the jit or Onnx models on GPU?
Thanks
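For reference, a TorchScript (jit) model can generally be placed on GPU via `map_location` when loading, with inputs moved to the same device. A minimal sketch with a hypothetical stand-in module and file name (the real silero_vad jit file is not shown here), falling back to CPU when CUDA is unavailable:

```python
import torch

# Hypothetical stand-in for the VAD model.
class TinyVAD(torch.nn.Module):
    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return torch.sigmoid(x.mean(dim=-1))

# Use the GPU if one is present, otherwise fall back to CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Save and reload the scripted model directly onto the chosen device.
torch.jit.save(torch.jit.script(TinyVAD()), "tiny_vad.jit")  # hypothetical file name
model = torch.jit.load("tiny_vad.jit", map_location=device)

# Inputs must live on the same device as the model.
audio = torch.randn(4, 1600, device=device)
with torch.no_grad():
    probs = model(audio)
```

The usual GPU errors come from device mismatches (model on CUDA, tensors on CPU, or the reverse), so keeping both on one explicit `device` is the first thing to check.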