
[Bug]: Anthropic Authentication Error #4651

Open
1 task done
apeixinho opened this issue Oct 31, 2024 · 4 comments
Labels
bug Something isn't working

Comments

@apeixinho

Is there an existing issue for the same bug?

  • I have checked the existing issues.

Describe the bug and reproduction steps

1 - Follow the installation steps ...
2 - Get error message "SANDBOX_USER_ID not found"
...
3 - On Windows, the following was used to run the Docker image:

winpty docker run -it --rm -e SANDBOX_USER_ID=$(id -u)  -e SANDBOX_RUNTIME_CONTAINER_IMAGE=docker.all-hands.dev/all-hands-ai/runtime:0.11-nikolaik -v //var//run//docker.sock://var//run//docker.sock -p 3000:3000 --add-host host.docker.internal:host-gateway --name openhands-app docker.all-hands.dev/all-hands-ai/openhands:0.11

4 - Check Anthropic and create a new API key
5 - Test with the new API key
6 - :( same result ... authentication error
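To isolate whether the key itself or the OpenHands setup is at fault, one option is to call the Anthropic Messages API directly, outside Docker. This is a minimal sketch using only the Python standard library; `build_request` and `check_key` are hypothetical helper names, and the model name and headers mirror the traceback below. A 401 from this call means the key is rejected by Anthropic regardless of OpenHands.

```python
import json
import urllib.error
import urllib.request

ANTHROPIC_URL = "https://api.anthropic.com/v1/messages"

def build_request(api_key: str) -> urllib.request.Request:
    """Build a minimal Messages API request carrying the x-api-key header."""
    body = json.dumps({
        "model": "claude-3-5-sonnet-20240620",
        "max_tokens": 8,
        "messages": [{"role": "user", "content": "ping"}],
    }).encode()
    return urllib.request.Request(
        ANTHROPIC_URL,
        data=body,
        headers={
            "x-api-key": api_key,
            "anthropic-version": "2023-06-01",
            "content-type": "application/json",
        },
    )

def check_key(api_key: str) -> int:
    """Return the HTTP status code: 200 means the key works, 401 means it is rejected."""
    try:
        with urllib.request.urlopen(build_request(api_key)) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code
```

Running `check_key("sk-ant-api03-...")` with a real key and getting 401 reproduces the same "invalid x-api-key" condition shown in the logs, with no Docker or litellm involved.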

OpenHands Installation

Docker command in README

OpenHands Version

0.11

Operating System

WSL on Windows

Logs, Errors, Screenshots, and Additional Context

File "/app/.venv/lib/python3.12/site-packages/litellm/main.py", line 1715, in completion
   response = anthropic_chat_completions.completion(
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 File "/app/.venv/lib/python3.12/site-packages/litellm/llms/anthropic/chat/handler.py", line 567, in completion
   raise AnthropicError(
litellm.llms.anthropic.common_utils.AnthropicError: {"type":"error","error":{"type":"authentication_error","message":"invalid x-api-key"}}

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
 File "/app/openhands/controller/agent_controller.py", line 156, in start_step_loop
   await self._step()
 File "/app/openhands/controller/agent_controller.py", line 436, in _step
   action = self.agent.step(self.state)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^
 File "/app/openhands/agenthub/codeact_agent/codeact_agent.py", line 216, in step
   response = self.llm.completion(**params)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 336, in wrapped_f
   return copy(f, *args, **kw)
          ^^^^^^^^^^^^^^^^^^^^
 File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 475, in __call__
   do = self.iter(retry_state=retry_state)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 376, in iter
   result = action(retry_state)
            ^^^^^^^^^^^^^^^^^^^
 File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 398, in <lambda>
   self._add_action_func(lambda rs: rs.outcome.result())
                                    ^^^^^^^^^^^^^^^^^^^
 File "/usr/local/lib/python3.12/concurrent/futures/_base.py", line 449, in result
   return self.__get_result()
          ^^^^^^^^^^^^^^^^^^^
 File "/usr/local/lib/python3.12/concurrent/futures/_base.py", line 401, in __get_result
   raise self._exception
 File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 478, in __call__
   result = fn(*args, **kwargs)
            ^^^^^^^^^^^^^^^^^^^
 File "/app/openhands/llm/llm.py", line 195, in wrapper
   resp: ModelResponse = completion_unwrapped(*args, **kwargs)
                         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 File "/app/.venv/lib/python3.12/site-packages/litellm/utils.py", line 1006, in wrapper
   raise e
 File "/app/.venv/lib/python3.12/site-packages/litellm/utils.py", line 896, in wrapper
   result = original_function(*args, **kwargs)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 File "/app/.venv/lib/python3.12/site-packages/litellm/main.py", line 2959, in completion
   raise exception_type(
         ^^^^^^^^^^^^^^^
 File "/app/.venv/lib/python3.12/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 2116, in exception_type
   raise e
 File "/app/.venv/lib/python3.12/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 459, in exception_type
   raise AuthenticationError(
litellm.exceptions.AuthenticationError: litellm.AuthenticationError: AnthropicException - {"type":"error","error":{"type":"authentication_error","message":"invalid x-api-key"}}
2024-10-31 02:15:54,762 - ERROR - Error while running the agent: litellm.AuthenticationError: AnthropicException - {"type":"error","error":{"type":"authentication_error","message":"invalid x-api-key"}}
2024-10-31 02:15:54,770 - ERROR - Traceback (most recent call last):
 File "/app/.venv/lib/python3.12/site-packages/litellm/llms/anthropic/chat/handler.py", line 552, in completion
   response = client.post(
              ^^^^^^^^^^^^
 File "/app/.venv/lib/python3.12/site-packages/litellm/llms/custom_httpx/http_handler.py", line 371, in post
   raise e
 File "/app/.venv/lib/python3.12/site-packages/litellm/llms/custom_httpx/http_handler.py", line 357, in post
   response.raise_for_status()
 File "/app/.venv/lib/python3.12/site-packages/httpx/_models.py", line 763, in raise_for_status
   raise HTTPStatusError(message, request=request, response=self)
httpx.HTTPStatusError: Client error '401 Unauthorized' for url 'https://api.anthropic.com/v1/messages'
For more information check: https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/401

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
 File "/app/.venv/lib/python3.12/site-packages/litellm/main.py", line 1715, in completion
   response = anthropic_chat_completions.completion(
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 File "/app/.venv/lib/python3.12/site-packages/litellm/llms/anthropic/chat/handler.py", line 567, in completion
   raise AnthropicError(
litellm.llms.anthropic.common_utils.AnthropicError: {"type":"error","error":{"type":"authentication_error","message":"invalid x-api-key"}}

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
 File "/app/openhands/controller/agent_controller.py", line 156, in start_step_loop
   await self._step()
 File "/app/openhands/controller/agent_controller.py", line 436, in _step
   action = self.agent.step(self.state)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^
 File "/app/openhands/agenthub/codeact_agent/codeact_agent.py", line 216, in step
   response = self.llm.completion(**params)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 336, in wrapped_f
   return copy(f, *args, **kw)
          ^^^^^^^^^^^^^^^^^^^^
 File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 475, in __call__
   do = self.iter(retry_state=retry_state)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 376, in iter
   result = action(retry_state)
            ^^^^^^^^^^^^^^^^^^^
 File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 398, in <lambda>
   self._add_action_func(lambda rs: rs.outcome.result())
                                    ^^^^^^^^^^^^^^^^^^^
 File "/usr/local/lib/python3.12/concurrent/futures/_base.py", line 449, in result
   return self.__get_result()
          ^^^^^^^^^^^^^^^^^^^
 File "/usr/local/lib/python3.12/concurrent/futures/_base.py", line 401, in __get_result
   raise self._exception
 File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 478, in __call__
   result = fn(*args, **kwargs)
            ^^^^^^^^^^^^^^^^^^^
 File "/app/openhands/llm/llm.py", line 195, in wrapper
   resp: ModelResponse = completion_unwrapped(*args, **kwargs)
                         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 File "/app/.venv/lib/python3.12/site-packages/litellm/utils.py", line 1006, in wrapper
   raise e
 File "/app/.venv/lib/python3.12/site-packages/litellm/utils.py", line 896, in wrapper
   result = original_function(*args, **kwargs)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 File "/app/.venv/lib/python3.12/site-packages/litellm/main.py", line 2959, in completion
   raise exception_type(
         ^^^^^^^^^^^^^^^
 File "/app/.venv/lib/python3.12/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 2116, in exception_type
   raise e
 File "/app/.venv/lib/python3.12/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 459, in exception_type
   raise AuthenticationError(
litellm.exceptions.AuthenticationError: litellm.AuthenticationError: AnthropicException - {"type":"error","error":{"type":"authentication_error","message":"invalid x-api-key"}}
@apeixinho
Author

Tested with claude-3-5-sonnet, since I don't have access to other Anthropic LLMs.

import os
from litellm import completion

# set env
os.environ["ANTHROPIC_API_KEY"] = "sgp_fd1b4edb60bf82b8_583b05eac759334bed1a12be31ff329d91dec2b5"

messages = [{"role": "user", "content": "Hey! how's it going?"}]
response = completion(model="claude-3-5-sonnet-20240620", messages=messages, stream=True)
for chunk in response:
    print(chunk["choices"][0]["delta"]["content"])  # same as openai format

with the following output:

Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new
LiteLLM.Info: If you need to debug this error, use `litellm.set_verbose=True'.

Traceback (most recent call last):
  File "C:\Python311\Lib\site-packages\litellm\main.py", line 1327, in completion
    response = anthropic_chat_completions.completion(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Python311\Lib\site-packages\litellm\llms\anthropic.py", line 609, in completion
    raise AnthropicError(
litellm.llms.anthropic.AnthropicError: {"type":"error","error":{"type":"authentication_error","message":"invalid x-api-key"}}

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "c:\Users\Student\Documents\testapi.py", line 8, in <module>
    response = completion(model="claude-3-5-sonnet-20240620", messages=messages, stream=True)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Python311\Lib\site-packages\litellm\utils.py", line 3421, in wrapper
    raise e
  File "C:\Python311\Lib\site-packages\litellm\utils.py", line 3314, in wrapper
    result = original_function(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Python311\Lib\site-packages\litellm\main.py", line 2434, in completion
    raise exception_type(
          ^^^^^^^^^^^^^^^
  File "C:\Python311\Lib\site-packages\litellm\utils.py", line 9972, in exception_type
    raise e
  File "C:\Python311\Lib\site-packages\litellm\utils.py", line 8757, in exception_type
    raise AuthenticationError(
litellm.exceptions.AuthenticationError: AnthropicException - {"type":"error","error":{"type":"authentication_error","message":"invalid x-api-key"}}

@SmartManoj
Contributor

SmartManoj commented Oct 31, 2024

Don't share your keys. Could you generate a new key and test it?
If it still fails, check out the free Google Gemini models. Click here to get the key.

--

Anthropic keys seem to start with sk-ant-api03. Did you get the key from here?
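Building on the comment above, a quick local format check can catch the most common mix-up, pasting a key from another provider (the key shared earlier in this thread begins with `sgp_`, which is not an Anthropic prefix). `looks_like_anthropic_key` is a hypothetical helper; the `sk-ant-` prefix is the publicly documented Anthropic key format.

```python
def looks_like_anthropic_key(key: str) -> bool:
    """Rough format check: Anthropic API keys start with 'sk-ant-'.

    This only catches obvious mix-ups (e.g. a key from a different provider);
    a well-formed key can still be revoked or otherwise invalid."""
    return key.startswith("sk-ant-")

# The key pasted earlier in this thread starts with 'sgp_', so it fails the check:
print(looks_like_anthropic_key("sgp_fd1b4edb60bf82b8_..."))  # False
print(looks_like_anthropic_key("sk-ant-api03-..."))          # True
```

A check like this won't prove a key is valid, but a False here means the 401 is expected before any network call is made.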

@neubig
Contributor

neubig commented Oct 31, 2024

Yeah, I think the key is invalid for some reason. If it's not working with litellm, it also won't work with OpenHands.

@teepean

teepean commented Nov 3, 2024

I am experiencing the same issue with Claude. I tested Gemini and it worked without problems.

I also did a test with openllm and it seems to work as well:

response = completion(model="claude-3-5-sonnet-20241022", messages=messages)
print(response)
ModelResponse(id='chatcmpl-fe784db2-358b-4b02-ad57-ac5d60fa64b4', choices=[Choices(finish_reason='stop', index=0, message=Message(content="I'm doing well, thanks! I'm Claude, an AI assistant. How are you today?", role='assistant', tool_calls=None, function_call=None))], created=1730630182, model='claude-3-5-sonnet-20241022', object='chat.completion', system_fingerprint=None, usage=Usage(completion_tokens=24, prompt_tokens=14, total_tokens=38, completion_tokens_details=None, prompt_tokens_details=PromptTokensDetailsWrapper(audio_tokens=None, cached_tokens=0, text_tokens=None, image_tokens=None), cache_creation_input_tokens=0, cache_read_input_tokens=0))

Last edit: Claude started working after a while so no issues at the moment.
