
[Bug] Azure OpenAI does not support o1-preview #4396

Open
Sylar-W opened this issue Oct 17, 2024 · 5 comments
Labels
azure · 🐛 Bug Something isn't working · Inactive No response in 30 days

Comments

@Sylar-W

Sylar-W commented Oct 17, 2024

📦 Environment

Docker

📌 Version

1.22.7

💻 Operating System

Windows

🌐 Browser

Chrome, Edge

🐛 Bug Description

When I switch to the Azure OpenAI provider, Lobe does not support the o1-preview model.

📷 Recurrence Steps

🚦 Expected Behavior

No response

📝 Additional Information

No response

@Sylar-W Sylar-W added the 🐛 Bug Something isn't working | 缺陷 label Oct 17, 2024
@github-project-automation github-project-automation bot moved this to Roadmap - Chat 1.x in Lobe Chat Routine Oct 17, 2024
@lobehubbot
Member

👀 @sylar003

Thank you for raising an issue. We will investigate the matter and get back to you as soon as possible.
Please make sure you have given us as much context as possible.


dosubot bot commented Oct 17, 2024

Hello, @sylar003! I'm here to help you with any bugs, questions, or contributions. Let's tackle this together!

I found a similar open issue about the Azure OpenAI o1 models not working. The problem is the max_tokens parameter, which the Azure OpenAI o1 models do not accept; they expect max_completion_tokens instead. To resolve this, modify the request construction so that max_tokens is replaced with max_completion_tokens for the o1 models. This change belongs where the request parameters are set for Azure OpenAI, specifically in the LobeAzureOpenAI class or wherever the request payload is constructed [1].
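A minimal sketch of the parameter swap Dosu describes (the helper name and plain-dict payload shape are hypothetical; LobeChat's actual fix would live in its Azure request builder):

```python
def adapt_payload_for_o1(payload: dict) -> dict:
    """Rewrite `max_tokens` -> `max_completion_tokens` for o1-family models.

    Azure OpenAI's o1 models reject `max_tokens`, so the request payload
    must carry `max_completion_tokens` instead. Other models are untouched.
    """
    model = payload.get("model", "")
    if model.startswith("o1") and "max_tokens" in payload:
        payload = dict(payload)  # avoid mutating the caller's dict
        payload["max_completion_tokens"] = payload.pop("max_tokens")
    return payload


# Example: an o1-preview request gets rewritten, a gpt-4o request does not.
print(adapt_payload_for_o1({"model": "o1-preview", "max_tokens": 512}))
print(adapt_payload_for_o1({"model": "gpt-4o", "max_tokens": 512}))
```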

To continue talking to Dosu, mention @dosu.

@dosubot dosubot bot added the azure label Oct 17, 2024
@BrandonStudio
Contributor

@arvinxx I think we should mask the Azure endpoint in the error message

@arvinxx
Contributor

arvinxx commented Oct 29, 2024

@BrandonStudio you are right

@lobehubbot lobehubbot added the Inactive No response in 30 days | 超过 30 天未活跃 label Dec 29, 2024
@wolph

wolph commented Feb 15, 2025

As a workaround, you can use LiteLLM to create a proxy that behaves like the OpenAI API. LobeChat will then think it is talking to OpenAI o1/o3 (mini), and it will work.
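A sketch of what such a LiteLLM proxy config might look like (the resource URL, deployment name, and API version are placeholders, not values from this thread):

```yaml
# litellm_config.yaml -- expose an Azure o1 deployment as an OpenAI-style model
model_list:
  - model_name: o1-preview            # name LobeChat will see
    litellm_params:
      model: azure/o1-preview         # your Azure deployment name
      api_base: https://YOUR-RESOURCE.openai.azure.com/
      api_key: os.environ/AZURE_API_KEY
```

You would then start the proxy with `litellm --config litellm_config.yaml` and point LobeChat's OpenAI-compatible endpoint at it (by default `http://localhost:4000`), so LiteLLM handles the Azure-specific parameter translation.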
