Replies: 2 comments
-
I have this problem right now after updating LangChain. It was working fine before.
-
I ran your code, and it now looks like there is a bug in the HuggingFaceHub integration:

---> 49 result = chain({"question": query, "chat_history": history})

AttributeError: 'InferenceClient' object has no attribute 'post'

Can someone assist with solving this?
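Not an official fix, just a hedged note: this error usually shows up when the installed huggingface_hub version no longer provides the deprecated InferenceClient.post method that older LangChain HuggingFace wrappers still call. A minimal sketch to check whether the endpoint itself works, bypassing LangChain entirely (it assumes a valid token in the HF_TOKEN environment variable and a model that is actually served by the Inference API; the model name is only an example):

```python
import os

from huggingface_hub import InferenceClient

# Standalone sanity check: if this call succeeds, the hosted endpoint is fine
# and the AttributeError comes from a LangChain / huggingface_hub version mismatch.
client = InferenceClient(token=os.environ["HF_TOKEN"])

reply = client.text_generation(
    "What is retrieval-augmented generation?",
    model="mistralai/Mistral-7B-Instruct-v0.2",  # example hosted text-generation model
    max_new_tokens=128,
)
print(reply)
```

If the standalone call works, upgrading the LangChain HuggingFace integration (or pinning huggingface_hub to a release that still ships InferenceClient.post) is usually the way out; which exact versions are compatible depends on your environment.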
-
I was trying to build a RAG LLM application using open-source models, but while generating the response the LLM attaches the entire prompt and the relevant documents to the output. Can anyone please tell me how I can remove the prompt and the Question section and get only the Answer in the response?
langchain Version: 0.1.9
Code:
Output:
I have also tried mistralai/Mistral-7B-Instruct-v0.2, NousResearch/Nous-Hermes-2-Mixtral-8x7B-DPO, and mistralai/Mixtral-8x7B-Instruct-v0.1, but got the same kind of result.
Can anyone help solve this issue?
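A hedged sketch of one common workaround, not necessarily matching the original code: the HF Inference API's text-generation endpoint echoes the prompt unless return_full_text is disabled, and prompt templates that end with an explicit "Answer:" marker can be post-processed so that only the generated answer is kept. The prompt template, marker string, token handling, and model name below are illustrative assumptions:

```python
import os

from huggingface_hub import InferenceClient

ANSWER_MARKER = "Answer:"  # assumed marker placed at the end of the prompt template

client = InferenceClient(token=os.environ["HF_TOKEN"])

prompt = (
    "Use the following context to answer the question.\n"
    "Context: LangChain is a framework for building LLM applications.\n"
    "Question: What is LangChain?\n"
    f"{ANSWER_MARKER}"
)

raw = client.text_generation(
    prompt,
    model="mistralai/Mistral-7B-Instruct-v0.2",
    max_new_tokens=256,
    return_full_text=False,  # ask the backend not to echo the prompt
)

# Fallback: if the backend still echoes the prompt, keep only what follows the marker.
answer = raw.split(ANSWER_MARKER)[-1].strip()
print(answer)
```

In LangChain's HuggingFace wrappers the same behaviour can often be controlled through the model kwargs passed to the endpoint, and the same split can be applied to the chain output; the exact parameter names depend on the wrapper version you have installed.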