Hi @Nisim1! I'm here to help you with any bugs or questions you have. Let's squash those bugs together! To resolve the "Embedding model client not properly initialized" error when using HuggingFace Embeddings Inference in Docker, you can follow these steps:
These steps should help in diagnosing and resolving the issue with the HuggingFace Embeddings Inference component in Docker [1].
Hey, what's your Langflow version, and which browser and OS are you using?
Same issue here. @Nisim1, did you solve it?
✅ Problem:
You're trying to use HuggingFace Embeddings Inference inside LangFlow running in Docker, and you're seeing this error:

"Embedding model client not properly initialized."

Even though the model works fine when tested via Postman, LangFlow throws this error. So what gives?

🛠️ Solution:
The issue is with how the component imports `HuggingFaceInferenceAPIEmbeddings`. You'll need to manually edit the component code as follows:

1. Change the import statement

Replace:

```python
from langchain_community.embeddings.huggingface import HuggingFaceInferenceAPIEmbeddings
```

With:

```python
from langchain_community.embeddings import HuggingFaceInferenceAPIEmbeddings
```

2. Update the …
Hi community!
I am trying to use HuggingFace Embeddings Inference.
My goal is to use open-source embedding models to perform text embedding.
I am getting an error that says:
"Embedding model client not properly initialized."
When I try to run the same model via Postman, it works!
How can this be resolved?
(LangFlow is running in Docker, if it matters.)
Thanks in advance!
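Since the model works via Postman, it can help to reproduce that exact check in Python to rule out the endpoint itself. A stdlib-only sketch, assuming the public HuggingFace Inference API feature-extraction endpoint (URL, model name, and token below are placeholders):

```python
# Build (but don't send) the same POST request Postman would make to the
# HuggingFace Inference API feature-extraction endpoint.
import json
import urllib.request

API_URL = "https://api-inference.huggingface.co/pipeline/feature-extraction/BAAI/bge-small-en-v1.5"
TOKEN = "hf_..."  # your HuggingFace token (placeholder)

def build_request(texts):
    """Build the POST request asking the endpoint to embed `texts`."""
    body = json.dumps({"inputs": texts}).encode("utf-8")
    return urllib.request.Request(
        API_URL,
        data=body,
        headers={
            "Authorization": f"Bearer {TOKEN}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_request(["hello world"])
# resp = urllib.request.urlopen(req)   # uncomment to actually send
# vectors = json.loads(resp.read())    # one embedding vector per input text
```

If this request succeeds but LangFlow still fails, the problem is in the component's client initialization rather than the endpoint.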