Error testing 'ollama-instance-1': Error from LLM provider 'ollama_chat': Exception - [Errno 111] Connection refused
I cannot get the SDK to connect to Ollama at localhost:11434.
When I run the curl command or try opening http://localhost:11434/ in the browser, it confirms that Ollama is running.
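For reference, a quick connectivity check like the one below (a rough Python equivalent of the curl command, assuming the default Ollama port 11434) succeeds and returns the "Ollama is running" message:

```python
# Rough equivalent of the curl check against the default Ollama endpoint.
import urllib.request

with urllib.request.urlopen("http://localhost:11434/", timeout=5) as resp:
    print(resp.status)           # 200 when Ollama is reachable
    print(resp.read().decode())  # "Ollama is running"
```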
Despite this, the SDK does not seem to recognize it.
Could you please advise if I might be missing any configuration steps?