Python: Bug: Cannot use GPT-O1 Reasoning Model as Azure Chat Completion LLM Service to the kernel for agentic system #12738

Closed
@trinaymithra

Description

I am trying to attach the GPT-o1 reasoning model as an Azure Chat Completion LLM service to the kernel and use it in a ChatCompletionAgent, but I'm getting the following error:

from semantic_kernel import Kernel
from semantic_kernel.agents import ChatCompletionAgent
from semantic_kernel.connectors.ai.open_ai import AzureChatCompletion

kernel_v1 = Kernel()

# GPT-o1 secrets: agentic model secrets
chat_service = AzureChatCompletion(
    service_id="azure-openai",
    deployment_name=agentic_model_secrets["openai_gpt_deployment_name"],
    api_key=agentic_model_secrets["openai_api_key"],
    api_version=agentic_model_secrets["openai_api_version"],
    endpoint=agentic_model_secrets["openai_api_base"],
)

# Add the LLM service to the kernel
kernel_v1.add_service(chat_service)

comparison_agent = ChatCompletionAgent(
    service=chat_service,
    kernel=kernel_v1,
    name="Chat_Agent",
    instructions="Answer user questions appropriately.",
    plugins=[chat_agent_plugin],
)

ServiceResponseException: ("<class 'semantic_kernel.connectors.ai.open_ai.services.azure_chat_completion.AzureChatCompletion'> service failed to complete the prompt", BadRequestError('Error code: 400 - {'error': {'message': "Unsupported value: 'messages[0].role' does not support 'system' with this model.", 'type': 'invalid_request_error', 'param': 'messages[0].role', 'code': 'unsupported_value'}}'))
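For context, the 400 response means the o1 deployment rejects any message whose role is `system` (the agent's `instructions` are sent as a system message). A common workaround is to remap system messages to the `user` role (or to `developer`, on API versions that support it) before the request is sent. Below is a minimal sketch of that remapping on plain OpenAI-style message dicts; it is not the Semantic Kernel internals, just an illustration of the transformation involved:

```python
def remap_system_messages(messages, target_role="user"):
    """Return a copy of `messages` with every 'system' role
    replaced by `target_role`, leaving other messages untouched."""
    return [
        {**m, "role": target_role} if m.get("role") == "system" else m
        for m in messages
    ]

history = [
    {"role": "system", "content": "Answer user questions appropriately."},
    {"role": "user", "content": "Hello"},
]
print(remap_system_messages(history)[0]["role"])  # → user
```

Whether this can be hooked into `ChatCompletionAgent` directly, or whether the instructions must be moved into the first user message, is part of what I'm asking.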

Please suggest how I can use the GPT-o1 reasoning model as the service for my ChatCompletionAgent.

Metadata
Labels: python (Pull requests for the Python Semantic Kernel)

Type: Bug