SpeziLLMOpenAI: Replace MacPaw/OpenAI With Generated API Calls #64
base: main
Conversation
Commit notes (truncated):
- …endpoint: enables swift-openapi-generator to generate streamed responses. See: openai/openai-openapi#311
- …outgoing requests
- …API response errors
- …Swiftlint type_contents_order_violation warning
Thank you for all the work here @paulhdk; very important to improve this setup and build on the OpenAPI specification!
It would be amazing to get first insights from @philippzagar for a good round of feedback.
@@ -51,7 +50,7 @@ public struct LLMOpenAIModelParameters: Sendable {
     /// - logitBias: Alters specific token's likelihood in completion.
     /// - user: Unique identifier for the end-user, aiding in abuse monitoring.
     public init(
-        responseFormat: ChatQuery.ResponseFormat? = nil,
+        responseFormat: Components.Schemas.CreateChatCompletionRequest.response_formatPayload? = nil,
I am wondering if we should add compact type aliases for this?
I’ve added an LLMOpenAIRequestType alias. Does that work for you?
Should we also introduce an alias for Components.Schemas in general? This won’t make the types shorter, but something like LLMOpenAIGeneratedTypes could improve readability, maybe?
I think we can introduce well-defined and well-named typealiases for the specific types that we use in our API surface; we should see if we can make them compact and focus on them.
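For illustration, a minimal sketch of the alias pattern being discussed: LLMOpenAIRequestType is the alias mentioned above (mapping it to the generated chat-completion request type is consistent with its use later in this thread), while the second alias is a hypothetical example of the same idea applied to the response-format payload from the diff:

// Compact alias for the generated chat-completion request type,
// as referenced in this thread.
public typealias LLMOpenAIRequestType = Components.Schemas.CreateChatCompletionRequest

// Hypothetical companion alias for the payload type seen in the diff above.
public typealias LLMOpenAIResponseFormat =
    Components.Schemas.CreateChatCompletionRequest.response_formatPayload

Call sites could then read LLMOpenAIRequestType(messages:model:...) instead of spelling out the full Components.Schemas path.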
/// "firstName": [ | ||
/// "type": "string", | ||
/// "description": "The first name of the person") | ||
/// ], |
I am wondering if we can add a nicely typed type for this instead of a dictionary; it can always map to a dictionary under the hood. It would be cool to avoid losing that type-safety element.
Previously, SpeziLLMOpenAI wrapped the Swift types provided by the OpenAI package, which would then eventually be passed to the API.
With the OpenAI OpenAPI spec, such types aren't generated; instead, the JSON schemas are validated for correctness as they're being encoded in the OpenAPIObjectContainer type.
Introducing such wrapper types again would require precise alignment with the OpenAI spec, which, I could imagine, would make them harder to maintain over time.
That may also be one reason why the official OpenAI Python package, which is likewise generated from the OpenAI OpenAPI specification, does not offer wrapper types either, AFAICT.
What do you think?
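For context, a minimal sketch of that validate-on-encode flow, assuming the schema dictionary is wrapped in OpenAPIRuntime's OpenAPIObjectContainer; the schema contents here are illustrative:

import Foundation
import OpenAPIRuntime

// The initializer throws if the dictionary holds values that cannot be
// represented as JSON, so validation happens without OpenAI-specific wrappers.
let schema = try OpenAPIObjectContainer(unvalidatedValue: [
    "type": "string",
    "description": "The first name of the person"
])

// Encoding the container produces the plain JSON schema sent to the API.
let json = try JSONEncoder().encode(schema)
print(String(decoding: json, as: UTF8.self))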
I think adding an extension initializer/function that takes the well-typed arguments, if one wants to use them, would be beneficial and would avoid issues with string keys that are incorrect or malformatted. Still allowing a dictionary to be passed in might be an escape hatch we can provide. The OpenAPI surface is quite stable, and if we use e.g. an enum for the type of the parameter, it can also have an other case with an associated string value.
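Something like the following minimal sketch, perhaps; the enum, the initializer, and the assumption that LLMFunctionParameterItemSchema is backed by an OpenAPIRuntime container offering init(unvalidatedValue:) are all hypothetical:

// Hypothetical shape for the typed parameter kind with an escape hatch.
enum LLMFunctionParameterKind {
    case string
    case integer
    case number
    case boolean
    // Escape hatch for JSON schema types not modeled explicitly.
    case other(String)

    var jsonType: String {
        switch self {
        case .string: return "string"
        case .integer: return "integer"
        case .number: return "number"
        case .boolean: return "boolean"
        case .other(let type): return type
        }
    }
}

extension LLMFunctionParameterItemSchema {
    // Hypothetical convenience initializer that maps well-typed arguments
    // to the underlying dictionary representation.
    init(kind: LLMFunctionParameterKind, description: String? = nil) throws {
        var schema: [String: (any Sendable)?] = ["type": kind.jsonType]
        if let description {
            schema["description"] = description
        }
        try self.init(unvalidatedValue: schema)
    }
}

This keeps malformed string keys out of call sites while the raw dictionary path remains available as the escape hatch.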
Sources/SpeziLLMOpenAI/FunctionCalling/LLMFunctionParameterSchemaCollector.swift (outdated review thread, resolved)
Thank you for continuing to work on this @paulhdk!
I had a quick sync with @philippzagar, and he will take a closer look at the PR to provide insights on the different changes. It would be great to update the PR to the latest version of main to resolve the conflicts; after the feedback from @philippzagar, I think we should be ready to get this merged 🚀
.Input(body: .json(LLMOpenAIRequestType(
    messages: openAIContext,
    model: schema.parameters.modelType,
    frequency_penalty: schema.modelParameters.frequencyPenalty,
    logit_bias: schema.modelParameters.logitBias.additionalProperties.isEmpty ? nil : schema
        .modelParameters
        .logitBias,
    max_tokens: schema.modelParameters.maxOutputLength,
    n: schema.modelParameters.completionsPerOutput,
    presence_penalty: schema.modelParameters.presencePenalty,
    response_format: schema.modelParameters.responseFormat,
    seed: schema.modelParameters.seed,
    stop: LLMOpenAIRequestType.stopPayload.case2(schema.modelParameters.stopSequence),
    stream: true,
    temperature: schema.modelParameters.temperature,
    top_p: schema.modelParameters.topP,
    tools: functions.isEmpty ? nil : functions,
    user: schema.modelParameters.user
)))
Might be nice to format this similarly to our other code bases; this might be applicable to other parts as well:
Suggested change:

.Input(body:
    .json(
        LLMOpenAIRequestType(
            messages: openAIContext,
            model: schema.parameters.modelType,
            frequency_penalty: schema.modelParameters.frequencyPenalty,
            logit_bias: schema.modelParameters.logitBias.additionalProperties.isEmpty ? nil : schema
                .modelParameters
                .logitBias,
            max_tokens: schema.modelParameters.maxOutputLength,
            n: schema.modelParameters.completionsPerOutput,
            presence_penalty: schema.modelParameters.presencePenalty,
            response_format: schema.modelParameters.responseFormat,
            seed: schema.modelParameters.seed,
            stop: LLMOpenAIRequestType.stopPayload.case2(schema.modelParameters.stopSequence),
            stream: true,
            temperature: schema.modelParameters.temperature,
            top_p: schema.modelParameters.topP,
            tools: functions.isEmpty ? nil : functions,
            user: schema.modelParameters.user
        )
    )
)
SpeziLLMOpenAI: Replace MacPaw/OpenAI With Generated API Calls
♻️ Current situation & Problem
This PR replaces the MacPaw/OpenAI package with API calls generated by the swift-openapi-generator package.
Calls are generated from OpenAI's official OpenAPI spec.
As discussed with @PSchmiedmayer, this marks the first step in adding the ability to send local image content to the OpenAI API.
This PR does not add any new features but simply replicates the existing feature set with the generated API calls.
I've tried my best to keep track of any known issues, both in-code with FIXMEs and in the following list.
Current Issues
- Sources/SpeziLLMOpenAI/LLMOpenAISession+Generation.swift does not handle the "[DONE]" message sent by the API to conclude a stream. There is currently a hacky workaround that catches the error that is thrown in that case. I'm not quite sure yet how to handle that case elegantly (see the sketch after this list).
- The LLMFunctionParameterItemSchema type does not use a generated type yet.
- The types in SpeziLLMOpenAI/FunctionCalling should, if possible, be refactored, as they currently have a lot of optional bindings.
- swift-openapi-generator expects an openapi.yaml and a configuration file in the TestApp, which is why there are duplicate OpenAPI specs and configuration files in this PR. I'm not quite sure why it expects them in the TestApp, but I suspect it has something to do with the generated types being used in the TestApp's model selection mechanism.
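For the first issue, a minimal sketch of detecting the sentinel before JSON decoding; the eventLines iteration is a hypothetical stand-in for however the streamed body is consumed, and the decode target assumes the generated CreateChatCompletionStreamResponse schema from the OpenAI spec:

import Foundation

// `eventLines` is a hypothetical async sequence of raw SSE lines from the
// streamed response body.
for try await line in eventLines {
    // OpenAI concludes the stream with a literal "[DONE]" payload; stop
    // before trying to decode it as JSON.
    if line == "data: [DONE]" { break }
    guard line.hasPrefix("data: ") else { continue }
    let payload = Data(line.dropFirst("data: ".count).utf8)
    let chunk = try JSONDecoder().decode(
        Components.Schemas.CreateChatCompletionStreamResponse.self,
        from: payload
    )
    // ... yield the deltas in `chunk.choices` to the consumer ...
}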
⚙️ Release Notes
📚 Documentation
As no new functionality is added, nothing should change here (unless I missed something).
✅ Testing
This PR passes the existing tests. Since no new functionality has been added, I believe this should suffice.
📝 Code of Conduct & Contributing Guidelines
By creating this pull request, you agree to follow our Code of Conduct and Contributing Guidelines: