[Bug]: Support for GPT4o-mini or gpt4- models #657

@oussamaJmaaa

Description

Hello, I'm having trouble using the gpt-4o-mini model with LangChain and OpenAI together with GPTCache.

Any idea how to solve this? Also, could you please share which OpenAI and GPTCache versions you are using?

Here is my current setup:

from langchain.llms import OpenAI
from gptcache.adapter.langchain_models import LangChainLLMs

llm = LangChainLLMs(llm=OpenAI(model="gpt-4o-mini", temperature=0))

When I run my QA chain, I get this error:
"This is a chat model and not supported in the v1/completions endpoint"

I tried upgrading my OpenAI version to 1.51.2 and then encountered this error:
"module 'openai' has no attribute 'api_base'. Did you mean: 'api_type'?"
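For what it's worth, here is a sketch of the direction I've been trying. The first error suggests gpt-4o-mini is a chat model, so it presumably has to go through LangChain's ChatOpenAI class and GPTCache's LangChainChat adapter rather than the completions-based OpenAI/LangChainLLMs pair; and the second error suggests GPTCache still expects the pre-1.0 openai package (which had the module-level openai.api_base), so pinning openai below 1.0 may be necessary. I haven't confirmed this against the exact versions involved, so treat it as an assumption:

```python
# Assumption: GPTCache's chat adapter plus a pinned openai<1.0, e.g.
#   pip install "openai<1.0"
# None of this is verified against the maintainers' recommended versions.
from langchain.chat_models import ChatOpenAI
from gptcache.adapter.langchain_models import LangChainChat

# gpt-4o-mini is a chat model, so route it through the chat endpoint
# (ChatOpenAI) instead of the legacy v1/completions endpoint (OpenAI).
chat = LangChainChat(chat=ChatOpenAI(model="gpt-4o-mini", temperature=0))
```

If that adapter is the right one, the rest of the QA chain would take `chat` in place of the old `llm` object.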

Any help would be greatly appreciated! Thanks in advance.
