Llama3.2-1B .task export does not work in mediapipe #411

Comments
Hi @mikel-brostrom, can you ensure you followed the steps here? https://ai.google.dev/edge/mediapipe/solutions/genai/llm_inference#pytorch-models . Let me know if that works for you.
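For reference, the bundling step from that guide can be sketched roughly like this. This is a hedged sketch, not the thread's actual export script: all file names are placeholders, and the start/stop token strings are assumptions for a Llama 3 family model.

```python
# Hedged sketch of the bundling step from the MediaPipe LLM Inference guide
# for PyTorch models. File names and token strings below are assumptions for
# a Llama 3 family model; adjust them to your own export.

def build_bundle_config(tflite_path, tokenizer_path, output_path):
    """Collect the fields that MediaPipe's genai bundler expects."""
    return {
        "tflite_model": tflite_path,          # converted TF Lite model
        "tokenizer_model": tokenizer_path,    # tokenizer file for the model
        "start_token": "<|begin_of_text|>",   # assumed Llama 3 start token
        "stop_tokens": ["<|eot_id|>"],        # assumed Llama 3 stop token
        "output_filename": output_path,       # resulting .task bundle
    }

# The actual bundling call requires mediapipe; shown here but not executed:
#   from mediapipe.tasks.python.genai import bundler
#   cfg = build_bundle_config("llama3_2_1b.tflite", "tokenizer.model",
#                             "llama3_2_1b.task")
#   bundler.create_bundle(bundler.BundleConfig(**cfg))
```

If the token strings do not match what the exported tokenizer uses, the bundler may still succeed but inference can misbehave, so they are worth double-checking against the model card.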
I have the same error log. I've already used the task bundler to convert the TFLite model to a .task file.

I updated my comment with the full script I am running @pkgoogle, @chienhuikuo. It clarifies my export workflow and hopefully makes this reproducible.
I haven't been able to reproduce it yet. The error message seems to indicate that the resulting TF Lite file (which is wrapped in the .task file) does not have a signature with the expected name. My understanding is that the signature handling happens here:
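To check that locally, one could peek inside the bundle and then at the wrapped model's signatures. A minimal sketch, assuming (this is an assumption, not confirmed by the thread) that a .task bundle is a plain zip archive; the file names are placeholders:

```python
# Minimal sketch: list what a .task bundle packs, assuming (hedged) that the
# bundle is a zip archive. Entry and file names below are placeholders.
import zipfile

def list_bundle_contents(task_path):
    """Return the names of the files packed inside a .task bundle."""
    with zipfile.ZipFile(task_path) as zf:
        return zf.namelist()

# With TensorFlow installed, the signatures of the unpacked TF Lite model
# could then be inspected (shown here, not executed):
#   interp = tf.lite.Interpreter(model_path="unpacked_model.tflite")
#   print(interp.get_signature_list())
```

Comparing that signature list against what MediaPipe's runtime asks for would confirm whether the export step or the bundling step dropped the signature.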
Hey @mikel-brostrom, I am trying to run the script you mentioned here in Google Colab. I am getting the error: File "/content/verify.py", line 23, in
Added the pip installs I am running to the original comment @krishna1870, @talumbau.
Description of the bug:

I exported my Llama3.2-1B to .task using:

When loading the .task model into the mediapipe app I get:

Am I missing something?
Actual vs expected behavior:
No response
Any other information you'd like to share?
No response