Hello, I'm having trouble using the gpt-4-o-mini model with LangChain or OpenAI alongside GPTCache. Any idea how to solve this? Also, could you please share which OpenAI and GPTCache versions you are using?
Here is my current setup:
```python
from langchain.llms import OpenAI
from gptcache.adapter.langchain_models import LangChainLLMs

llm = LangChainLLMs(llm=OpenAI(model="gpt-4-o-mini", temperature=0))
```

When I run my QA chain, I get this error:

"This is a chat model and not supported in the v1/completions endpoint"
I tried upgrading the openai package to 1.51.2 and then encountered this error:
"module 'openai' has no attribute 'api_base'. Did you mean: 'api_type'?"
Any help would be greatly appreciated! Thanks in advance.