Hi @bryanhpchiang -- I'm not sure whether folks have tried converting JAX models to ONNX, but I can ask. The current best practice for inference is to use `jax2tf` and the existing infrastructure around `tf.SavedModel`. (We should probably add a short guide about that to our documentation.)

Answer selected by bhchiang