
[BUG]: RESPONSE_FORMAT_JSON_SCHEMA and strictJsonSchema settings not applied properly #15

@ryahiaoui

Description


I tested the liberty-car-booking example. The RAG functionality and tools work correctly. However, when integrating a new service that returns a structured result based on a JSON Schema, using the configuration shown below, I noticed that the AI service did not respect the supportedCapabilities and strictJsonSchema settings.

In the request sent to the LLM, instead of adding the JSON Schema under the "response_format" key at the root of the payload, the plugin simply appends a user message instructing the model to answer in a format matching the JSON schema. The expected request should look something like:

{
  "messages": [
    {
      "role": "user",
      "content": "*********"
    }
  ],
  "temperature": 0.7,
  "model": "gpt-4o-mini",
  "response_format": {
    "json_schema": {
      ...
    }
  },
  ...
}
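
For reference, with Azure OpenAI API version 2024-08-01-preview (the service version configured below), the structured-output block is expected to carry the schema name, the strict flag and the schema itself; the json_schema contents shown here are illustrative only:

"response_format": {
  "type": "json_schema",
  "json_schema": {
    "name": "MyObject",
    "strict": true,
    "schema": {
      "type": "object",
      "properties": { ... },
      "required": [ ... ],
      "additionalProperties": false
    }
  }
}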

Configuration Used

smallrye.llm.plugin.chat-model.class=dev.langchain4j.model.azure.AzureOpenAiChatModel
smallrye.llm.plugin.chat-model.config.api-key=${azure.openai.api.key}
smallrye.llm.plugin.chat-model.config.endpoint=${azure.openai.endpoint}
smallrye.llm.plugin.chat-model.config.service-version=2024-08-01-preview
smallrye.llm.plugin.chat-model.config.deployment-name=${azure.openai.deployment.name}
smallrye.llm.plugin.chat-model.config.temperature=0.1
smallrye.llm.plugin.chat-model.config.topP=0.1
smallrye.llm.plugin.chat-model.config.timeout=120s
smallrye.llm.plugin.chat-model.config.max-retries=2
smallrye.llm.plugin.chat-model.config.logRequestsAndResponses=true
smallrye.llm.plugin.chat-model.config.listeners=@all
smallrye.llm.plugin.chat-model.config.supportedCapabilities=dev.langchain4j.model.chat.Capability.RESPONSE_FORMAT_JSON_SCHEMA
smallrye.llm.plugin.chat-model.config.strictJsonSchema=true

Observed Behavior

It appears that the following configuration values are not being applied properly:

smallrye.llm.plugin.chat-model.config.supportedCapabilities=dev.langchain4j.model.chat.Capability.RESPONSE_FORMAT_JSON_SCHEMA
smallrye.llm.plugin.chat-model.config.strictJsonSchema=true

Despite being configured, the request does not include the proper "response_format" block.

Log Trace

[14/06/2025 02:12:00:307 CEST] 00000064 io.smallrye.llm.plugin.CommonLLMPluginCreator I Attempt to feed : supportedCapabilities (supportedCapabilities) with : dev.langchain4j.model.chat.Capability.RESPONSE_FORMAT_JSON_SCHEM
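
From this trace it looks as if the raw property string is handed to the builder without being converted. A minimal sketch, assuming the plugin needs to map the configured value onto the Set<Capability> expected by AzureOpenAiChatModel.Builder#supportedCapabilities (the CapabilityConfigSupport class and toCapabilities helper are hypothetical, not part of the plugin):

import java.util.Set;
import java.util.stream.Collectors;
import java.util.stream.Stream;

import dev.langchain4j.model.chat.Capability;

// Hypothetical helper: turns the configured string, e.g.
// "dev.langchain4j.model.chat.Capability.RESPONSE_FORMAT_JSON_SCHEMA",
// into the Set<Capability> that the model builder expects.
class CapabilityConfigSupport {

    static Set<Capability> toCapabilities(String configuredValue) {
        return Stream.of(configuredValue.split(","))
                .map(String::trim)
                .map(v -> v.substring(v.lastIndexOf('.') + 1)) // keep only the enum constant name
                .map(Capability::valueOf)
                .collect(Collectors.toSet());
    }
}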

Expected Behavior

The plugin should serialize the capabilities and schema properly into the request payload sent to the LLM, including "response_format": { "json_schema": { ... } } as part of the top-level JSON.

Additional Info: Direct LangChain4j Usage Works

When using LangChain4j directly, the expected behavior works perfectly, as shown in the example below:

import static dev.langchain4j.model.chat.Capability.RESPONSE_FORMAT_JSON_SCHEMA;

import java.util.Set;

import dev.langchain4j.model.azure.AzureOpenAiChatModel;
import dev.langchain4j.model.chat.ChatModel;
import dev.langchain4j.service.AiServices;

// Same settings as the configuration above, applied through the builder directly.
ChatModel chatModel = AzureOpenAiChatModel.builder()
    .endpoint("****")
    .apiKey("*****")
    .deploymentName("gpt-4o-mini")
    .supportedCapabilities(Set.of(RESPONSE_FORMAT_JSON_SCHEMA))
    .strictJsonSchema(true)
    .serviceVersion("2024-08-01-preview")
    .logRequestsAndResponses(true)
    .build();

jsonSchemaAIService assistant = AiServices.create(jsonSchemaAIService.class, chatModel);
MyObject answer = assistant.chat(question);

This confirms that the issue lies in the SmallRye plugin integration layer, not in the model or LangChain4j itself.
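
For completeness, a minimal sketch of what the jsonSchemaAIService and MyObject used above might look like (the fields of MyObject are made up for illustration):

import dev.langchain4j.service.UserMessage;

// Illustrative return type; the real fields of MyObject are application-specific.
record MyObject(String title, String summary) {
}

// Because the return type is a POJO, LangChain4j derives a JSON schema from it and,
// when the model declares RESPONSE_FORMAT_JSON_SCHEMA support, should send it as "response_format".
interface jsonSchemaAIService {
    MyObject chat(@UserMessage String question);
}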
