[Bug]: CreateStreamedResponse fails with custom models that return sources without choices #545
I think that's an important thing we have to think about. Half the issues on the board are about compatibility with a different platform, yet this is the OpenAI library. I don't disagree with you, but all the typing becomes a bit moot if we are just skipping, null-coalescing, etc. our way through the minor differences on the other platforms. We'd have to make things nullable or missing, and maybe that's not a bad thing, because I do see the benefit: all these other models have parity with OpenAI. Yet I don't want downstream users to have their linting tools go mad supporting all the null/changing aspects when it may not be relevant for OpenAI.

Then again, when I think about a drop-in MinIO/Wasabi replacement for Amazon S3, it just works. Is it up to the Amazon PHP library to support alternatives that happen to have minor incompatibilities with S3? Probably not. I'd be curious whether @nunomaduro / @gehrisandro have taken a stance one way or the other on how they expect this library to adjust.
I appreciate the thoughtful response. I'd like to point out that the library already supports alternative backends through the factory:

```php
$client = OpenAI::factory()
    ->withApiKey($config['api_key'])
    ->withBaseUri($config['base_uri'])
    ->withHttpClient($httpClient)
    ->withStreamHandler(fn (RequestInterface $request): ResponseInterface => $httpClient->send($request, [
        'stream' => true,
    ]))
    ->make();
```

This suggests the library was designed with some level of cross-platform compatibility in mind. Users are already encouraged to use different base URIs for alternative services that implement OpenAI-compatible APIs. Given this existing flexibility, my suggestion for a compatibility mode aligns well with the library's architecture:

```php
$client = OpenAI::factory()
    ->withApiKey($config['api_key'])
    ->withBaseUri($config['base_uri'])
    ->withCompatibilityMode('extended') // New method to handle response variations
    ->make();
```

This approach would:

By adding this option, you'd be completing the compatibility story that was already started with the base URI configuration, making the library truly useful for the growing ecosystem of OpenAI-compatible services while keeping the default behavior strict for pure OpenAI users.
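To make the proposal concrete, here is a minimal sketch of what an `'extended'` compatibility mode might do before hydrating a streamed chunk. Note that neither `withCompatibilityMode()` nor `normalizeChunk()` exists in openai-php/client today; both names are assumptions for illustration.

```php
<?php

// Hypothetical sketch: what an 'extended' compatibility mode could do
// to a raw streamed chunk before hydration. normalizeChunk() is an
// assumed name, not a real library function.
function normalizeChunk(array $attributes): array
{
    // Open WebUI can emit a leading chunk that carries 'sources' but no
    // 'choices'; default it to an empty array so hydration never throws.
    $attributes['choices'] ??= [];

    return $attributes;
}
```

The appeal of normalizing at this layer is that the response objects keep their non-nullable types: callers simply see an empty choices list for non-standard chunks and can skip them.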
I see. I wonder how an opt-in compatibility change could dynamically affect typing. I believe the baseUri feature historically existed because OpenAI was available both through Azure and directly from OpenAI, so you needed the ability to toggle between them. Of course, that opened the gate for everything you see here.
After digging into this issue with Claude, I need a raw payload sample to confirm a fix. When I investigated this issue with Claude, it was a new property
So reducing the support on
For context, I'm looking for something like this.
Sorry for the late reply. I don't have a sample anymore, because I worked around it by using a regular HTTP library instead.
Description
Bug: CreateStreamedResponse fails with custom models that return sources without choices
Issue Description
When using custom models through Open WebUI that return sources in the first response without a 'choices' array, the `CreateStreamedResponse::from()` method fails because it can't find the expected 'choices' key in the response attributes.

Steps To Reproduce
Current Behavior
The first response from a custom model using sources returns something like:
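The original report does not include the captured payload. An illustrative chunk (the exact field layout inside `sources` is assumed, not taken from a real capture) might look like:

```json
{
  "sources": [
    { "source": "document.pdf", "content": "..." }
  ]
}
```

The important point is that the `choices` key is absent entirely, rather than present and empty.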
This causes the following code in `CreateStreamedResponse::from()` to fail:

The error occurs because `$attributes['choices']` doesn't exist in the response, causing the stream to fail instead of gracefully handling or skipping this response chunk.

Expected Behavior
The client should be able to handle response chunks that don't contain the 'choices' field, either by skipping them or by handling them gracefully. This would allow the client to work with custom models that return sources information, particularly in the first chunk of a streamed response.
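The two behaviors described above could be sketched as follows. `handleChunk()` is an illustrative helper for this issue, not part of the library: a chunk lacking 'choices' is either skipped (returns null) or passed through raw so the caller can read 'sources' itself.

```php
<?php

// Illustrative helper, not a real library API. Chunks without a
// 'choices' key are either skipped (null) or returned untouched,
// depending on the flag.
function handleChunk(array $attributes, bool $skipNonStandard = true): ?array
{
    if (! array_key_exists('choices', $attributes)) {
        return $skipNonStandard ? null : $attributes;
    }

    return $attributes;
}
```

Skipping keeps the existing typed response objects intact; passing through would require a separate, looser chunk type, which ties back to the typing concerns discussed earlier in the thread.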
Environment
OpenAI PHP Client Version
v0.10.3
PHP Version
8.3.16
Notes
This issue specifically affects custom models in Open WebUI that return sources information. The standard OpenAI API doesn't have this issue, but as more people use the client with alternative backends, handling non-standard response formats becomes important for compatibility.