
Conversation

@kirill670 commented Nov 10, 2025

Summary

Added tool calling emulation for OpenAI compatible providers.

Type of Change

  • [x] Feature
  • [x] Bug fix
  • Refactor / Code quality
  • Performance improvement
  • Documentation
  • Tests
  • Security fix
  • Build / Release
  • Other (specify below)

AI Assistance

  • [x] This PR was created or reviewed with AI assistance

Testing

Unit tests

Related issues

None

Screenshots/Demos (for UX changes)

Before:
None yet
After:
None yet

Submitting a Recipe?

Email:

Torvald Linus and others added 2 commits November 9, 2025 23:53
…compatible providers

This update introduces a `supports_tool_calling` flag on the OpenAI and OpenAI-compatible providers. The flag determines whether native or emulated tool/function calling is used when talking to a model's API. When native tool calling is unavailable or disabled, local tool-call emulation keeps agents functional, improving interoperability and reliability across legacy or limited OpenAI-compatible APIs.

- Adds `supports_tool_calling` bool field to OpenAiProvider and DeclarativeProviderConfig.
- Updates provider constructors and environment/config read logic.
- If the flag is false, tool usage is emulated via a local handler instead of the native API.
- Maintains compatibility for providers that do not natively support tools.

This change improves agent robustness across different OpenAI(-compatible) providers. No breaking changes expected; migration is seamless.
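The dispatch the commit describes can be sketched as a small standalone function. This is a minimal illustration, not the actual goose code: the names `ToolCallMode`, `ProviderConfig`, and `select_mode` are hypothetical, and only the branching condition mirrors the behavior described above.

```rust
// Hypothetical sketch of the flag-gated dispatch; ToolCallMode and
// select_mode are illustrative names, not actual goose identifiers.
#[derive(Debug, PartialEq)]
enum ToolCallMode {
    Native,   // send tools through the API's native function-calling field
    Emulated, // parse tool calls out of plain completions locally
}

struct ProviderConfig {
    supports_tool_calling: bool,
}

fn select_mode(config: &ProviderConfig, tool_count: usize) -> ToolCallMode {
    // Emulation only kicks in when the provider lacks native support
    // and the request actually carries tools; otherwise use the native path.
    if !config.supports_tool_calling && tool_count > 0 {
        ToolCallMode::Emulated
    } else {
        ToolCallMode::Native
    }
}
```

Note that a provider with the flag set to false but no tools attached still takes the native path, which is what makes the change non-breaking for tool-free requests.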
…nai-providers

Add tool calling support flag to OpenAI and compatible providers
@michaelneale (Collaborator) commented Nov 10, 2025

thanks @kirill670 do you mind doing the DCO signing thing? https://github.com/block/goose/pull/5648/checks?check_run_id=54928685902

(also a quick cargo fmt and ./script/clippy-lint.sh)

@DOsinga (Collaborator) left a comment

This is great, but can you have a look at simplifying the code a bit?

I also do wonder, do we want this primarily on openai? or also on ollama?

let interpreter = OllamaInterpreter::new()?;
message = augment_message_with_tool_calls(&interpreter, message, tools).await?;

Ok((message, ProviderUsage::new(model, usage)))
the if and else branches are identical. Why not leave it all as is, but then add this last bit at the end?

if !self.supports_tool_calling && !tools.is_empty() {
    use super::toolshim::{augment_message_with_tool_calls, OllamaInterpreter};
    let interpreter = OllamaInterpreter::new()?;
    message = augment_message_with_tool_calls(&interpreter, message, tools).await?;
}

And then, instead of disabling streaming below when you need tool shimming, you can do the same thing there? Maybe also lift out the use statement?

.get_param("OPENAI_SUPPORTS_TOOL_CALLING")
.unwrap_or_else(|_| "true".to_string())
.parse()
.unwrap_or(true);
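The quoted read defaults to native tool calling at two points: when the key is absent and when the value fails to parse as a bool. The same pattern, reduced to a standalone function for illustration (the helper name and `Option<&str>` input are assumptions, not the actual config API):

```rust
// Parse an optional config value the way the quoted snippet does:
// a missing value and an unparsable value both default to `true`.
// Note that str::parse::<bool>() only accepts the exact strings
// "true" and "false", so anything else falls through to the default.
fn parse_supports_tool_calling(raw: Option<&str>) -> bool {
    raw.map(str::to_string)
        .unwrap_or_else(|| "true".to_string())
        .parse()
        .unwrap_or(true)
}
```

One consequence of the double fallback: values like "yes" or "1" silently resolve to `true`, since Rust's bool parser rejects them.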
I think we can do without this; if you want to have an openai provider that does tool emulation, just use the declarative provider

@michaelneale (Collaborator) commented:

I wonder if we get it working on OpenAI first, then make sure Ollama really is just using OpenAI compatibility (not sure how close it is, but worth thinking about). Ideally we want it on both, but they should be getting closer together.

I am also curious how well people are finding this emulation working

@DOsinga (Collaborator) commented Nov 10, 2025

It is already pretty close indeed. Our Ollama provider doesn't have an API key (and why should it), but some people run their own Ollama instance on a remote server with an API key; I've told them to just use the OpenAI-compatible custom provider, and that seems to work.
