
Danswer fails to respond. Search & Chat Tab both are broken ("/usr/local/lib/python3.11/site-packages/litellm/llms/ollama.py", line 380, in ollama_completion_stream raise OllamaError( litellm.llms.ollama.OllamaError: b'') #3430

Closed
MyCodeBits opened this issue Dec 11, 2024 · 2 comments
MyCodeBits commented Dec 11, 2024

Initial setup details:

  • Running on macOS.
  • On the latest main-branch commit as of Dec 11, 2024: recognize updates #3397.
  • Running a local llama3.1 model via Ollama, which Danswer is connected to (a quick connectivity check is sketched after this list):
    ollama run llama3.1
Issues:

Following the quickstart doc, these are the issues I am seeing:

  1. Search Tab:
Screenshot 2024-12-11 at 11 18 34 AM

Issue: the AI Answer spinner runs and some documents are retrieved, but it eventually bails out without a response.

Logs from container danswer-stack-api_server-1:

Error logs - Container - danswer-stack-api_server-1.md

Last stack trace seen in the above log file:

2024-12-11 11:46:19 ERROR:    12/11/2024 07:46:19 PM               query_backend.py  283: Error in search answer streaming
2024-12-11 11:46:19 Traceback (most recent call last):
2024-12-11 11:46:19   File "/app/danswer/server/query_and_chat/query_backend.py", line 275, in stream_generator
2024-12-11 11:46:19     for packet in stream_search_answer(
2024-12-11 11:46:19   File "/app/danswer/utils/timing.py", line 70, in wrapped_func
2024-12-11 11:46:19     value = next(gen)
2024-12-11 11:46:19             ^^^^^^^^^
2024-12-11 11:46:19   File "/app/danswer/one_shot_answer/answer_question.py", line 368, in stream_search_answer
2024-12-11 11:46:19     for obj in objects:
2024-12-11 11:46:19   File "/app/danswer/one_shot_answer/answer_question.py", line 267, in stream_answer_objects
2024-12-11 11:46:19     for packet in cast(AnswerObjectIterator, answer.processed_streamed_output):
2024-12-11 11:46:19   File "/app/danswer/llm/answering/answer.py", line 281, in processed_streamed_output
2024-12-11 11:46:19     for processed_packet in self._get_response([llm_call]):
2024-12-11 11:46:19   File "/app/danswer/llm/answering/answer.py", line 186, in _get_response
2024-12-11 11:46:19     yield from self._handle_specified_tool_call(llm_calls, tool, tool_args)
2024-12-11 11:46:19   File "/app/danswer/llm/answering/answer.py", line 164, in _handle_specified_tool_call
2024-12-11 11:46:19     yield from self._get_response(llm_calls + [new_llm_call])
2024-12-11 11:46:19   File "/app/danswer/llm/answering/answer.py", line 245, in _get_response
2024-12-11 11:46:19     yield from response_handler_manager.handle_llm_response(stream)
2024-12-11 11:46:19   File "/app/danswer/llm/answering/llm_response_handler.py", line 69, in handle_llm_response
2024-12-11 11:46:19     for message in stream:
2024-12-11 11:46:19   File "/app/danswer/llm/chat_llm.py", line 459, in _stream_implementation
2024-12-11 11:46:19     for part in response:
2024-12-11 11:46:19   File "/usr/local/lib/python3.11/site-packages/litellm/llms/ollama.py", line 427, in ollama_completion_stream
2024-12-11 11:46:19     raise e
2024-12-11 11:46:19   File "/usr/local/lib/python3.11/site-packages/litellm/llms/ollama.py", line 380, in ollama_completion_stream
2024-12-11 11:46:19     raise OllamaError(
2024-12-11 11:46:19 litellm.llms.ollama.OllamaError: b''
  2. Chat Tab:
Screenshot 2024-12-11 at 11 22 12 AM

Clicking on Summarize the Doc:

Screenshot 2024-12-11 at 11 22 46 AM

The following error is seen:

Screenshot 2024-12-11 at 11 28 57 AM

Stack trace (it ends in the same OllamaError; a direct-call check against Ollama is sketched after it):

Traceback (most recent call last):
  File "/app/danswer/chat/process_message.py", line 644, in stream_chat_message_objects
    for packet in answer.processed_streamed_output:
  File "/app/danswer/llm/answering/answer.py", line 281, in processed_streamed_output
    for processed_packet in self._get_response([llm_call]):
  File "/app/danswer/llm/answering/answer.py", line 245, in _get_response
    yield from response_handler_manager.handle_llm_response(stream)
  File "/app/danswer/llm/answering/llm_response_handler.py", line 69, in handle_llm_response
    for message in stream:
  File "/app/danswer/llm/chat_llm.py", line 459, in _stream_implementation
    for part in response:
  File "/usr/local/lib/python3.11/site-packages/litellm/llms/ollama.py", line 427, in ollama_completion_stream
    raise e
  File "/usr/local/lib/python3.11/site-packages/litellm/llms/ollama.py", line 380, in ollama_completion_stream
    raise OllamaError(
litellm.llms.ollama.OllamaError: b''
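
Both traces above end in OllamaError: b''. As far as I can tell, the message litellm attaches to OllamaError in ollama.py is the raw body of the failed streaming request, so an empty b'' suggests Ollama returned nothing at all (or was never reached). Below is a minimal sketch of the same streaming /api/generate call made directly with the standard library, to see what Ollama actually returns outside of Danswer/litellm; the base URL is the same assumption as in the setup section, and the model name matches the ollama run llama3.1 command above.

```python
# Minimal sketch of the streaming generate call that litellm issues to Ollama.
# OLLAMA_BASE is an assumption (see the setup section); the model name matches
# the `ollama run llama3.1` command above.
import json
import urllib.request

OLLAMA_BASE = "http://host.docker.internal:11434"  # assumed
payload = json.dumps({"model": "llama3.1", "prompt": "Say hi", "stream": True}).encode()

req = urllib.request.Request(
    f"{OLLAMA_BASE}/api/generate",
    data=payload,
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req, timeout=120) as resp:
    print("HTTP status:", resp.status)
    for line in resp:  # Ollama streams one JSON object per line
        if not line.strip():
            continue
        chunk = json.loads(line)
        print(chunk.get("response", ""), end="", flush=True)
        if chunk.get("done"):
            break
```

If this prints tokens, the model side looks fine and the problem is more likely in how the Danswer container reaches Ollama; if it fails the same way, the issue is on the Ollama side.
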

Any inputs on how I can get past this issue?

github-actions bot commented Feb 25, 2025

This issue is stale because it has been open 75 days with no activity. Remove stale label or comment or this will be closed in 15 days.

github-actions bot added the Stale label on Feb 25, 2025

github-actions bot commented Mar 5, 2025

This issue was closed because it has been stalled for 90 days with no activity.

github-actions bot closed this as not planned on Mar 5, 2025