Replies: 2 comments
- Are you running chaiNNer on a GPU? 4 hours vs. 168 hours is a huge difference if both runs use the same GPU and the same underlying ONNX library.
- I wouldn't recommend using TensorRT with chaiNNer. Use PyTorch, or ONNX with the plain CUDA execution provider.
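As a minimal sketch of the suggestion above: with the standalone `onnxruntime` API you can ask for the CUDA execution provider explicitly and let it fall back to CPU. This assumes the `onnxruntime-gpu` package is installed; `"model.onnx"` is a placeholder path, not a file from this discussion.

```python
# Run an ONNX model through onnxruntime's CUDA execution provider
# instead of TensorRT. Provider order is a preference list: onnxruntime
# uses the first provider it can initialize and falls back to the next.
PROVIDERS = ["CUDAExecutionProvider", "CPUExecutionProvider"]

def make_session(model_path: str):
    """Create an inference session that prefers CUDA, falling back to CPU."""
    import onnxruntime as ort  # requires the onnxruntime-gpu package for CUDA
    return ort.InferenceSession(model_path, providers=PROVIDERS)
```

A session created this way is run with e.g. `session.run(None, {"input": input_array})`, where `"input"` must match the model's declared input name.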
- Every model runs much more slowly in chaiNNer than it does outside of the application. Most recently, I tried running an ONNX model with TensorRT: the file took 4 hours to process outside of chaiNNer and 7 days to process in chaiNNer. Why is chaiNNer so slow?