I have a PyTorch model used for NLP, which I have converted to ONNX. I am using an up-to-date BentoML version, v1.0.13.
I saw a way to pass a tokenizer for transformers models, but this doesn't work for an ONNX model.
Is there any way to create a pipeline for ONNX so that it carries the tokenizer along for inference?
I just noticed that the onnx module lets you specify external_modules, but is there any example of how to define it?
external_modules (:code:`List[ModuleType]`, `optional`, default to :code:`None`):
user-defined additional python modules to be saved alongside the model or custom objects,
e.g. a tokenizer module, preprocessor module, model configuration module
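Something like the sketch below is what I have in mind, assuming `custom_objects` can carry a pickled tokenizer and `external_modules` takes the Python module(s) that define any custom preprocessing (the module name `my_preprocessing` and the model/tag names are hypothetical):

```python
import bentoml
import onnx
from transformers import AutoTokenizer

import my_preprocessing  # hypothetical user module with custom pre/post-processing

# Load the exported ONNX model and the tokenizer the PyTorch model was trained with.
onnx_model = onnx.load("model.onnx")
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

# Save the model and attach the tokenizer via custom_objects; external_modules
# saves the listed module(s) alongside the model so the custom objects can be
# loaded again at serving time.
bentoml.onnx.save_model(
    "nlp_onnx_model",
    onnx_model,
    custom_objects={"tokenizer": tokenizer},
    external_modules=[my_preprocessing],
)

# At inference time, retrieve the tokenizer from the same model store entry.
model_ref = bentoml.onnx.get("nlp_onnx_model:latest")
tokenizer = model_ref.custom_objects["tokenizer"]
runner = model_ref.to_runner()
```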