First off, amazing job with this 👏🏻 I just came across this and wanted to test it out with a local Ollama instance, but couldn't find much documentation on how to get it to work.
Error:
- I believe it should work without an API key
- or should there be fields to supply one?
Tested with:
- followed the docs here: https://gethelp.tiledesk.com/articles/ai-prompt-action-multi-llm/#using-ollama-with-action-promptai
- using the `docker-compose.yml` for tiledesk-server
- I've tested with:
  - http://localhost:11434
  - http://host.docker.internal:11434
  - http://localmachine:11434
- I've also tried exposing Ollama through host.docker.internal as an extra_host, but to no avail (see the compose sketch after this list).
- in Design Studio, I'm using the AI Prompt block with {{lastUserText}} as the prompt.
- when previewing, I keep getting:
  > Answer preview ( generated with Ollama ) Oops! Something went wrong. Please retry in a few moment.
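
For reference, this is roughly the extra_hosts mapping I tried in `docker-compose.yml` (a minimal sketch; the service name `server` and the rest of the file are abbreviated from my local setup):

```yaml
services:
  server:                     # the tiledesk-server service in my compose file
    # ... existing image / ports / environment settings ...
    extra_hosts:
      # map host.docker.internal to the Docker host so the container
      # can reach an Ollama instance running on the host machine
      - "host.docker.internal:host-gateway"
```

With that mapping I'd expect http://host.docker.internal:11434 to resolve from inside the container and point at the host's Ollama, but the preview above still fails.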
If anyone has experience with this, any help with this is greatly appreciated 🙏🏻
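
In case it helps narrow things down, a quick sanity check I can run (assuming curl is available in the image and the service is called `server` in the compose file):

```sh
# 1) confirm Ollama itself is up on the host (lists installed models)
curl http://localhost:11434/api/tags

# 2) confirm the tiledesk-server container can reach it via the extra_host mapping
docker compose exec server curl http://host.docker.internal:11434/api/tags
```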