This is broken. I expect this to work with local ollama out of the box again; please fix it!

```
$ gptme --model local/crewai-llama2-uncensored "hello"
WARNING:gptme.models:Unknown model crewai-llama2-uncensored, using fallback metadata
[23:43:58] Using logdir ~/.local/share/gptme/logs/2024-10-08-walking-pink-bird
Using workspace at ~/Documents/Scripts/AI/gptme-ai
Using project configuration at ~/Documents/Scripts/AI/gptme-ai/gptme.toml
Skipped 2 hidden system messages, show with --show-hidden
--- ^^^ past messages ^^^ ---
User: hello
INFO:openai._base_client:Retrying request to /chat/completions in 0.409328 seconds
INFO:openai._base_client:Retrying request to /chat/completions in 0.802017 seconds
Traceback (most recent call last):
  File "/home/jay/.local/share/pipx/venvs/gptme/lib/python3.12/site-packages/httpx/_transports/default.py", line 72, in map_httpcore_exceptions
    yield
  File "/home/jay/.local/share/pipx/venvs/gptme/lib/python3.12/site-packages/httpx/_transports/default.py", line 236, in handle_request
    resp = self._pool.handle_request(req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/jay/.local/share/pipx/venvs/gptme/lib/python3.12/site-packages/httpcore/_sync/connection_pool.py", line 216, in handle_request
    raise exc from None
  File "/home/jay/.local/share/pipx/venvs/gptme/lib/python3.12/site-packages/httpcore/_sync/connection_pool.py", line 196, in handle_request
    response = connection.handle_request(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/jay/.local/share/pipx/venvs/gptme/lib/python3.12/site-packages/httpcore/_sync/connection.py", line 99, in handle_request
    raise exc
  File "/home/jay/.local/share/pipx/venvs/gptme/lib/python3.12/site-packages/httpcore/_sync/connection.py", line 76, in handle_request
    stream = self._connect(request)
             ^^^^^^^^^^^^^^^^^^^^^^
  File "/home/jay/.local/share/pipx/venvs/gptme/lib/python3.12/site-packages/httpcore/_sync/connection.py", line 122, in _connect
    stream = self._network_backend.connect_tcp(**kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/jay/.local/share/pipx/venvs/gptme/lib/python3.12/site-packages/httpcore/_backends/sync.py", line 205, in connect_tcp
    with map_exceptions(exc_map):
  File "/usr/lib/python3.12/contextlib.py", line 158, in __exit__
    self.gen.throw(value)
  File "/home/jay/.local/share/pipx/venvs/gptme/lib/python3.12/site-packages/httpcore/_exceptions.py", line 14, in map_exceptions
    raise to_exc(exc) from exc
httpcore.ConnectError: [Errno 111] Connection refused

The above exception was the direct cause of the following exception:

Traceback (most recent call last):

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
```
Replies: 1 comment
It looks like litellm has changed its default port; see #175 (comment).
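If that's the cause, a possible workaround is to point gptme at the port your local server actually listens on rather than the old default. This is a sketch under a couple of assumptions: that your gptme version honors the `OPENAI_BASE_URL` environment variable for `local/` models (check your version's docs), and that the server speaks the OpenAI-compatible API on the port shown (ollama's default is 11434; newer litellm proxy builds reportedly default to 4000 instead of the old 8000):

```shell
# Assumption: gptme reads OPENAI_BASE_URL for local/ model providers.
# Replace 11434 with whatever port your server reports it is listening on
# (e.g. 4000 for a recent litellm proxy).
export OPENAI_BASE_URL="http://localhost:11434/v1"
gptme --model local/crewai-llama2-uncensored "hello"
```

If the connection is still refused, verify the server is up and note the port it logs at startup before pointing gptme at it.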