runTools function isn't running tools in parallel #1131

Open

danfhernandez opened this issue Oct 14, 2024 · 2 comments
Labels
bug Something isn't working

Comments

@danfhernandez

Confirm this is a Node library issue and not an underlying OpenAI API issue

  • This is an issue with the Node library

Describe the bug

I'm attempting to have tools run in parallel using runTools, but the tool functions are invoked sequentially rather than in parallel.

To Reproduce

  1. Create a completion with multiple tools
  2. Call runTools with a prompt that will likely invoke multiple tool calls
  3. Wait for the stream and observe that the tool functions clearly do not run in parallel

Code snippets

No response
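
For illustration only (no snippet was attached above), a minimal setup of the shape described in the reproduction steps might look like the sketch below. The model name, prompt, and tool definitions are assumptions, not taken from the report; the tool shape follows the openai-node runTools helper:

```ts
import OpenAI from 'openai';

const client = new OpenAI();

// Two independent async tools that could, in principle, run concurrently.
async function getWeather(args: { city: string }) {
  return `Sunny in ${args.city}`;
}

async function getLocalTime(args: { city: string }) {
  return `12:00 in ${args.city}`;
}

const runner = client.beta.chat.completions.runTools({
  model: 'gpt-4o',
  messages: [
    { role: 'user', content: 'What is the weather and the local time in Paris?' },
  ],
  tools: [
    {
      type: 'function',
      function: {
        function: getWeather,
        parse: JSON.parse,
        description: 'Get the weather for a city',
        parameters: {
          type: 'object',
          properties: { city: { type: 'string' } },
          required: ['city'],
        },
      },
    },
    {
      type: 'function',
      function: {
        function: getLocalTime,
        parse: JSON.parse,
        description: 'Get the local time for a city',
        parameters: {
          type: 'object',
          properties: { city: { type: 'string' } },
          required: ['city'],
        },
      },
    },
  ],
});

console.log(await runner.finalContent());
```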

OS

Mac

Node version

Latest

Library version

Latest

@atesgoral

Came here to open an issue and found this. +1 on this. Here's a concrete example from my test app:

Setup

I'm omitting the parallel_tool_calls parameter or setting it to true, which amounts to the same behaviour since true is the default. I have a prompt and tool descriptions that let the LLM emit multiple tool calls in one completion, and my tool functions are async functions that can run in parallel.
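
A simplified sketch of that setup (reconstructed and abbreviated, not the exact code from my test app; the mock just sleeps 1000ms and returns a canned URL):

```ts
import OpenAI from 'openai';

const client = new OpenAI();
const t0 = Date.now(); // roughly when the user input is sent

const log = (tag: string, payload: unknown) =>
  console.log(`[@${Date.now() - t0}ms] [${tag}]`, JSON.stringify(payload));

const sleep = (ms: number) => new Promise((resolve) => setTimeout(resolve, ms));

// Mock tool: sleeps 1000ms, then returns a canned answer.
async function lookupResource(args: { resource_type: 'DISCOUNT' | 'PRODUCT'; query: string }) {
  await sleep(1000);
  return args.resource_type === 'DISCOUNT'
    ? 'https://example.com/api/discounts/1'
    : 'https://example.com/api/products/1';
}

const runner = client.beta.chat.completions.runTools({
  model: 'gpt-4o',
  parallel_tool_calls: true, // or omitted entirely; true is the default
  messages: [{ role: 'user', content: 'Do I have a BFCM discount and a plush pillow product?' }],
  tools: [
    {
      type: 'function',
      function: {
        function: lookupResource,
        parse: JSON.parse,
        description: 'Look up a resource (discount, product, ...) by type and query',
        parameters: {
          type: 'object',
          properties: {
            resource_type: { type: 'string', enum: ['DISCOUNT', 'PRODUCT'] },
            query: { type: 'string' },
          },
          required: ['resource_type', 'query'],
        },
      },
    },
  ],
});

// Listeners like these produce the [function_call] / [function_call_result] lines below.
runner.on('functionCall', (call) => log('function_call', call));
runner.on('functionCallResult', (result) => log('function_call_result', result));

console.log('Assistant>', await runner.finalContent());
```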

Output from test application

The [@...] tags are the time elapsed since user input:

User> Do I have a BFCM discount and a plush pillow product?

[@1831ms] [function_call] {"name":"lookup_resource","arguments":"{\"resource_type\": \"DISCOUNT\", \"query\": \"BFCM\"}"}

[@1831ms] [function_call] {"name":"lookup_resource","arguments":"{\"resource_type\": \"PRODUCT\", \"query\": \"plush pillow\"}"}

[@2833ms] [function_call_result] "https://example.com/api/discounts/1"

[@3837ms] [function_call_result] "https://example.com/api/products/1"

Assistant> Yes, you have a BFCM discount and a plush pillow product in your store. If you need any help managing them, just let me know!

The tool above is a mock that sleeps 1000ms and returns a canned answer. From the output above we can see:

  1. The function calls arrive at the 1831ms mark
  2. The functions are invoked in sequence, with their promises awaited one by one (the results arrive roughly 1000ms apart instead of together)

Expectation: the functions are invoked at the same time, and their promise resolutions/rejections are awaited as a bundle.
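
To illustrate the difference (placeholder code, not the library's actual internals; invokeTool is a stand-in name):

```ts
type ToolCall = { name: string; args: string };

// Stand-in for whatever executes one tool call (here: a 1000ms mock like the one above).
const invokeTool = async (call: ToolCall): Promise<string> => {
  await new Promise((resolve) => setTimeout(resolve, 1000));
  return `result of ${call.name}(${call.args})`;
};

const toolCalls: ToolCall[] = [
  { name: 'lookup_resource', args: '{"resource_type":"DISCOUNT","query":"BFCM"}' },
  { name: 'lookup_resource', args: '{"resource_type":"PRODUCT","query":"plush pillow"}' },
];

// Observed behaviour: each call is awaited before the next starts (~2000ms total here).
for (const call of toolCalls) {
  await invokeTool(call);
}

// Expected behaviour: calls start together and are awaited as a bundle (~1000ms total here).
await Promise.all(toolCalls.map((call) => invokeTool(call)));
```

With the parallel version, a rejection from any tool call surfaces through Promise.all; something like Promise.allSettled could be used instead if partial results should still be reported back to the model.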

Maybe this could be a mode toggle, since in some systems it may still be desirable to run tools exclusively, in sequence.

@jacobzim-stl
Contributor

jacobzim-stl commented Oct 29, 2024

Thanks for this in-depth report; I agree this is important. We'll do some investigating!
