
chore: [OpenAI] (PoC) Generated model classes #266

Open · wants to merge 16 commits into base: main

Conversation

Contributor

@newtork newtork commented Jan 9, 2025

Context

https://github.tools.sap/AI/ai-sdk-java-backlog/issues/129

Generated code for OpenAI specification 2024-10-02

Feature scope:

  • add generated code
    • provide workarounds
    • add PRs in Cloud SDK
  • update unit and integration tests
    • update controller class in sample application

Definition of Done

  • Functionality scope stated & covered
  • Tests cover the scope above
  • Error handling created / updated & covered by the tests above
  • Aligned changes with the JavaScript SDK
  • Documentation updated
  • Release notes updated

a-d added 4 commits January 8, 2025 09:47
# Conflicts:
#	foundation-models/openai/src/main/java/com/sap/ai/sdk/foundationmodels/openai/OpenAiClient.java
@newtork newtork changed the title Generated OpenAI Generated OpenAI model classes Jan 9, 2025
@newtork newtork marked this pull request as draft January 9, 2025 10:19
bot-sdk-js and others added 8 commits January 9, 2025 10:19
…-openai

# Conflicts:
#	foundation-models/openai/src/main/java/com/sap/ai/sdk/foundationmodels/openai/OpenAiChatCompletionDelta.java
#	foundation-models/openai/src/test/java/com/sap/ai/sdk/foundationmodels/openai/OpenAiClientTest.java
Comment on lines 328 to 340
return new ClientStreamingHandler<>(deltaType, OpenAiError.class, OpenAiClientException::new)
    .objectMapper(
        JACKSON
            .addMixIn(
                CreateChatCompletionResponse.class,
                JacksonMixins.CreateChatCompletionResponseMixIn.class)
            .addMixIn(
                CreateChatCompletionStreamResponse.class,
                JacksonMixins.CreateChatCompletionStreamResponseMixIn.class)
            .addMixIn(
                ChatCompletionsCreate200Response.class,
                JacksonMixins.ChatCompletionCreate200ResponseMixIn.class))
    .handleStreamingResponse(client.executeOpen(null, request, null));
Contributor Author

@newtork newtork Jan 13, 2025

(Comment)

Workaround because Jackson is unable to determine the deserializer by itself for:

  • ChatCompletionsCreate200Response
    • CreateChatCompletionStreamResponse extends *
    • CreateChatCompletionResponse extends *
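
A mix-in lets us pin the deserialization target without editing the generated classes. A minimal, self-contained sketch of the mechanism, using hypothetical stand-in classes (`Base`, `Concrete`) rather than the generated ones:

```java
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.annotation.JsonDeserialize;

public class MixinDemo {
  // Hypothetical stand-ins for the generated response hierarchy.
  public static class Base {
    public String id;
  }

  public static class Concrete extends Base {}

  // The mix-in pins the concrete type Jackson should use when asked for Base,
  // without touching the (generated) class itself.
  @JsonDeserialize(as = Concrete.class)
  public abstract static class BaseMixIn {}

  public static Base parse(final String json) {
    try {
      return new ObjectMapper().addMixIn(Base.class, BaseMixIn.class).readValue(json, Base.class);
    } catch (final Exception e) {
      throw new IllegalStateException(e);
    }
  }
}
```

The `addMixIn` calls in the snippet above apply the same mechanism to the generated response hierarchy.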

Member

@rpanackal rpanackal Jan 17, 2025

(Question)

Maybe I missed the problem, but I don't see why deduction would not work: there is an additional property prompt_filter_results in CreateChatCompletionResponse that is not present in the other.

According to my understanding, as long as there is a distinguishing property, Jackson should be able to resolve the subtype.
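
For reference, deduction-based subtype resolution (JsonTypeInfo.Id.DEDUCTION, available since Jackson 2.12) matches the properties present in the payload against each registered subtype. A minimal sketch with hypothetical stand-in classes, not the generated ones:

```java
import com.fasterxml.jackson.annotation.JsonSubTypes;
import com.fasterxml.jackson.annotation.JsonTypeInfo;
import com.fasterxml.jackson.databind.ObjectMapper;
import java.util.List;

public class DeductionDemo {
  // Hypothetical stand-ins for the generated response hierarchy.
  @JsonTypeInfo(use = JsonTypeInfo.Id.DEDUCTION)
  @JsonSubTypes({
    @JsonSubTypes.Type(PlainResponse.class),
    @JsonSubTypes.Type(StreamResponse.class)
  })
  public abstract static class BaseResponse {
    public String id;
  }

  public static class PlainResponse extends BaseResponse {
    // the distinguishing property discussed above
    public List<Object> prompt_filter_results;
  }

  public static class StreamResponse extends BaseResponse {
    // a property assumed unique to the streamed variant, for illustration
    public String system_fingerprint;
  }

  public static BaseResponse parse(final String json) {
    try {
      return new ObjectMapper().readValue(json, BaseResponse.class);
    } catch (final Exception e) {
      throw new IllegalStateException(e);
    }
  }
}
```

Note that deduction fails with an ambiguity error when a payload carries no distinguishing property at all, which matters when actual payloads deviate from the specification.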

Contributor Author

> don't see the reason why deduction would not work

The actual payload for streamed responses differs significantly from the specification; it is inconsistent. Whatever annotation or setting we choose will be a compromise. If you find deduction working, or if you prefer a different mixin, please be my guest and change it as part of the upcoming convenience API draft.

Comment on lines 132 to 152
  @Nonnull
- public OpenAiChatCompletionOutput chatCompletion(@Nonnull final String prompt)
+ public CreateChatCompletionResponse chatCompletion(@Nonnull final String prompt)
      throws OpenAiClientException {
-   final OpenAiChatCompletionParameters parameters = new OpenAiChatCompletionParameters();
+   final CreateChatCompletionRequest parameters = new CreateChatCompletionRequest();

    if (systemPrompt != null) {
-     parameters.addMessages(new OpenAiChatSystemMessage().setContent(systemPrompt));
+     parameters.addMessagesItem(
+         new ChatCompletionRequestSystemMessage()
+             .role(ChatCompletionRequestSystemMessage.RoleEnum.SYSTEM)
+             .content(ChatCompletionRequestSystemMessageContent.create(systemPrompt)));
    }
-   parameters.addMessages(new OpenAiChatUserMessage().addText(prompt));
+   parameters
+       .addMessagesItem(
+           new ChatCompletionRequestUserMessage()
+               .role(ChatCompletionRequestUserMessage.RoleEnum.USER)
+               .content(ChatCompletionRequestUserMessageContent.create(prompt)))
+       .functions(null)
+       .tools(null)
+       .parallelToolCalls(null);
    return chatCompletion(parameters);
  }
Contributor Author

@newtork newtork Jan 13, 2025

(Comment)

This is the unfortunate diff expected for regular usage, without a convenience API.

Comment on lines 280 to 319
- "messages" : [ {
-   "role" : "system",
-   "content" : "You are a helpful AI"
- }, {
-   "role" : "user",
-   "content" : [ {
-     "type" : "text",
-     "text" : "Hello World! Why is this phrase so famous?"
-   } ]
- } ]
+ "temperature" : 1,
+ "top_p" : 1,
+ "stream" : false,
+ "presence_penalty" : 0,
+ "frequency_penalty" : 0,
+ "messages" : [ {
+   "content" : "You are a helpful AI",
+   "role" : "system"
+ }, {
+   "content" : "Hello World! Why is this phrase so famous?",
+   "role" : "user"
+ } ],
+ "logprobs" : false,
+ "n" : 1,
+ "parallel_tool_calls" : true,
+ "tools" : [ ],
+ "functions" : [ ]
Contributor Author

(Comment)

Diff in the payload being sent.
May require further finetuning to suppress default values and/or empty arrays.
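
One option for suppressing nulls and empty arrays is Jackson's @JsonInclude. A minimal sketch with a hypothetical Request stand-in (not the generated CreateChatCompletionRequest):

```java
import com.fasterxml.jackson.annotation.JsonInclude;
import com.fasterxml.jackson.databind.ObjectMapper;
import java.util.List;

public class PayloadTrimDemo {
  // Hypothetical stand-in for the generated request class; field names only
  // mirror the payload diff shown above.
  @JsonInclude(JsonInclude.Include.NON_EMPTY)
  public static class Request {
    public List<String> tools; // [] is dropped from the payload
    public List<String> functions; // null is dropped as well
    public Integer n;
    public String model;
  }

  public static String serialize(final Request request) {
    try {
      return new ObjectMapper().writeValueAsString(request);
    } catch (final Exception e) {
      throw new IllegalStateException(e);
    }
  }
}
```

NON_EMPTY only drops nulls and empty containers; populated scalar defaults such as "n" : 1 would additionally need Include.NON_DEFAULT or explicit nulling, as done with .functions(null) in the diff above.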

Comment on lines -376 to +426
- .isEqualTo(new float[] {0.0f, 3.4028235E38f, 1.4E-45f, 1.23f, -4.56f});
+ .containsExactly(
+     new BigDecimal("0.0"),
+     new BigDecimal("3.4028235E+38"),
+     new BigDecimal("1.4E-45"),
+     new BigDecimal("1.23"),
+     new BigDecimal("-4.56"));
Contributor Author

(Major/Comment)

Instead of float[] we now have List<BigDecimal> for embedding vectors. This needs finetuning.
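
Until that finetuning lands, callers can convert back to the previous representation themselves. A small sketch (the helper name is made up, not part of the SDK):

```java
import java.math.BigDecimal;
import java.util.List;

public class EmbeddingVectors {
  // Convert the generated List<BigDecimal> embedding representation back to
  // the float[] the previous hand-written API exposed. Each BigDecimal is
  // rounded to the nearest float.
  public static float[] toFloatArray(final List<BigDecimal> embedding) {
    final float[] result = new float[embedding.size()];
    for (int i = 0; i < embedding.size(); i++) {
      result[i] = embedding.get(i).floatValue();
    }
    return result;
  }
}
```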

Comment on lines -456 to -460
- OpenAiChatCompletionOutput totalOutput = new OpenAiChatCompletionOutput();
- final List<OpenAiChatCompletionDelta> deltaList =
-     stream.peek(totalOutput::addDelta).toList();
Contributor Author

(Major/Comment)

I found the totalOutput handling confusing. I did not write any basic convenience for it; instead I made sure the information can be extracted, with casting and/or custom-field lookups. This needs refinement.

Comment on lines 89 to 113
  @Nonnull
- public OpenAiChatCompletionOutput chatCompletionImage(@Nonnull final String linkToImage) {
+ public CreateChatCompletionResponse chatCompletionImage(@Nonnull final String linkToImage) {
+   final var partText =
+       new ChatCompletionRequestMessageContentPartText()
+           .type(TEXT)
+           .text("Describe the following image.");
+   final var partImageUrl =
+       new ChatCompletionRequestMessageContentPartImageImageUrl()
+           .url(URI.create(linkToImage))
+           .detail(HIGH);
+   final var partImage =
+       new ChatCompletionRequestMessageContentPartImage().type(IMAGE_URL).imageUrl(partImageUrl);
+   final var userMessage =
+       new ChatCompletionRequestUserMessage()
+           .role(USER)
+           .content(ChatCompletionRequestUserMessageContent.create(List.of(partText, partImage)));
    final var request =
-       new OpenAiChatCompletionParameters()
-           .addMessages(
-               new OpenAiChatMessage.OpenAiChatUserMessage()
-                   .addText("Describe the following image.")
-                   .addImage(
-                       linkToImage,
-                       OpenAiChatMessage.OpenAiChatUserMessage.ImageDetailLevel.HIGH));
+       new CreateChatCompletionRequest()
+           .addMessagesItem(userMessage)
+           .functions(null)
+           .tools(null)
+           .parallelToolCalls(null);

    return OpenAiClient.forModel(GPT_4O).chatCompletion(request);
  }
Contributor Author

(Comment)

Another showcase of consuming the generated OpenAI code without convenience.

@newtork newtork marked this pull request as ready for review January 13, 2025 09:58
@newtork newtork changed the title Generated OpenAI model classes chore: [openai] (PoC) Generated model classes, Jan 13, 2025
@newtork newtork changed the title chore: [openai] (PoC) Generated model classes, chore: [openai] (PoC) Generated model classes Jan 13, 2025
@newtork newtork added the please-review Request to review a pull-request label Jan 13, 2025
a-d and others added 3 commits January 13, 2025 11:39
# Conflicts:
#	foundation-models/openai/src/test/java/com/sap/ai/sdk/foundationmodels/openai/OpenAiClientTest.java
@CharlesDuboisSAP CharlesDuboisSAP removed the please-review Request to review a pull-request label Jan 16, 2025
@newtork newtork changed the title chore: [openai] (PoC) Generated model classes chore: [OpenAI] (PoC) Generated model classes Jan 31, 2025