Is there a separate Olive/ONNX optimization script? #353
-
Olive/ONNX optimization requires a lot of RAM and VRAM. My device can run the WebUI just fine with DirectML, but it can't optimize checkpoints due to insufficient memory. I'd like to use Google Colab just to optimize the checkpoint, but Colab prohibits the WebUI and terminates it automatically. Is there a separate script for the optimization that I could run in Colab? By the way, I was assuming that running an Olive-optimized checkpoint doesn't require significantly more RAM than a normal checkpoint with plain DirectML. Or does it? Thank you.
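For context, here is a minimal sketch of driving Olive directly from Python (for example in a Colab cell) without launching the WebUI at all. It is not the WebUI's own optimization script; it assumes the `olive-ai` package is installed and that `unet_config.json` is a hypothetical Olive workflow config you have prepared (the WebUI builds similar per-submodel configs for the text encoder, UNet, and VAE).

```python
# Sketch only: standalone Olive run, assuming olive-ai is installed
# and unet_config.json is an Olive workflow config you supply.
from olive.workflows import run as olive_run

# olive_run takes a path to a workflow JSON (or an equivalent dict) and
# executes the passes it describes, e.g. ONNX conversion followed by
# ONNX Runtime graph optimization, writing the results to the output dir
# named in the config.
olive_run("unet_config.json")
```

The optimized ONNX files produced this way could then be downloaded from Colab and placed where the WebUI expects them, though the exact folder layout depends on the WebUI version.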
-
You have two options.