ollama: add tools support #1022
base: main
Conversation
Lovely, can you get the lint issue addressed?
Fixed. By the way, the ollamaclient in langchaingo is no longer up to date (deprecated options, old single-embedding endpoint); I may propose a PR soon to remove the internal ollama package from langchaingo and replace it with https://pkg.go.dev/github.com/ollama/ollama/api

Side note: this seems to conflict with at least one other existing example PR: #951
Is it ready for use now?
Just bumping this :) Thanks for your contributions. RE: "llama3.1 doesn't properly use message history with tool results and I don't really know why"
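For anyone debugging the history issue quoted above: with tool use, the follow-up request is expected to replay the whole exchange — the user turn, the assistant turn that carried the tool calls, and one `tool`-role message per result. A sketch of that sequence in Go; the struct and field names here are illustrative, not the exact langchaingo types:

```go
package main

import "fmt"

// Illustrative message shapes for a chat-with-tools exchange; the field
// names mirror the common Ollama-style JSON ("role", "content",
// "tool_calls"), but these structs are a sketch, not a real API.
type ToolCall struct {
	Name      string
	Arguments map[string]any
}

type Message struct {
	Role      string // "user", "assistant", or "tool"
	Content   string
	ToolCalls []ToolCall
}

// historyWithToolResult builds the full sequence a follow-up request is
// expected to replay: user turn, assistant turn that requested the tool,
// then a "tool" role message carrying the result.
func historyWithToolResult() []Message {
	return []Message{
		{Role: "user", Content: "What is the weather in Paris?"},
		{Role: "assistant", ToolCalls: []ToolCall{{
			Name:      "get_weather",
			Arguments: map[string]any{"city": "Paris"},
		}}},
		{Role: "tool", Content: `{"temp_c": 18, "conditions": "cloudy"}`},
	}
}

func main() {
	// Print the role sequence the model should see on the next request.
	for _, m := range historyWithToolResult() {
		fmt.Println(m.Role)
	}
}
```

If any of these turns is dropped (for example, the assistant message with the tool calls), the model loses the link between the tool result and its own request, which would produce exactly the kind of confusion reported with llama3.1.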
Curious. What style of tool call orchestration is supported? Can the LLM invoke more than one tool call in a response? Can it dictate that specific tools be called concurrently for any reason, or is everything synchronous?
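On the orchestration question: in Ollama's chat API an assistant message can carry several entries in its `tool_calls` array, but the API only returns the list — it does not execute or schedule anything, so the caller decides whether to run the calls sequentially or fan them out. A minimal Go sketch of parsing such a message; the struct names follow the documented JSON shape, and the sample payload is fabricated for illustration:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// Structs matching the tool_calls shape of an Ollama-style assistant
// message: each entry wraps a "function" with "name" and "arguments".
type FunctionCall struct {
	Name      string         `json:"name"`
	Arguments map[string]any `json:"arguments"`
}

type ToolCall struct {
	Function FunctionCall `json:"function"`
}

type AssistantMessage struct {
	Role      string     `json:"role"`
	Content   string     `json:"content"`
	ToolCalls []ToolCall `json:"tool_calls"`
}

// Fabricated sample response containing two tool calls in one message.
const sample = `{
  "role": "assistant",
  "content": "",
  "tool_calls": [
    {"function": {"name": "get_weather", "arguments": {"city": "Paris"}}},
    {"function": {"name": "get_time", "arguments": {"tz": "Europe/Paris"}}}
  ]
}`

// parseToolCalls extracts every tool call from one assistant message.
func parseToolCalls(data []byte) ([]ToolCall, error) {
	var msg AssistantMessage
	if err := json.Unmarshal(data, &msg); err != nil {
		return nil, err
	}
	return msg.ToolCalls, nil
}

func main() {
	calls, err := parseToolCalls([]byte(sample))
	if err != nil {
		panic(err)
	}
	// Everything here is synchronous; a caller could equally dispatch
	// each call in its own goroutine and gather the results.
	for _, c := range calls {
		fmt.Println(c.Function.Name)
	}
}
```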
In this PR I tried to:
Notes:
- Except for the Parameters field, where I set the `any` type (https://github.com/ollama/ollama/blob/main/api/types.go#L168) for easier integration.
- The model properly answers with the JSON tool call format, but at the end it seems that llama3.1 doesn't properly use message history with tool results and I don't really know why. It would be interesting to test with other models.
May close #965
PR Checklist
- PR title follows conventions, e.g. `memory: add interfaces for X, Y` or `util: add whizzbang helpers`.
- PR message references any related issues (e.g. `Fixes #123`).
- Code passes `golangci-lint` checks.