Export models to ONNX and TorchScript #2464
roaldarbol
started this conversation in
Ideas
Replies: 1 comment
Hey @roaldarbol, I'll chip into the discussion on the other thread, but just FYI: I got a chance to connect with Nicholas Guilbeault (@ncguilbeault) here at SfN, who suggested that TorchScript might be the easiest path to integration. We'll likely target that first and then work on ONNX. Talmo
Heyo! Hope all is well in San Diego!
I'm currently working quite a bit with Bonsai, and very soon with SLEAP once again, as I'm planning on using my SLEAP model in Bonsai. However, the Bonsai.SLEAP package was built for TensorFlow models; now that the backend is PyTorch, those models are no longer supported. The most obvious way forward is to convert SLEAP models into general formats, so that solutions on the Bonsai side can be generalised to other torch models.
So I'd like to suggest writing some new functions to export/convert a model to TorchScript and/or ONNX. Aiming for both would give a lot of flexibility in terms of possible ways to incorporate SLEAP models, not limited to Bonsai.
I imagine it could be accessible both through the CLI and the GUI, to make it as easy as possible for users going forward.
Here's the ongoing discussion of the Bonsai.SLEAP package with @glopesdev.
Looking forward to chatting more about it!