
Use GitHub Copilot in local LiteLLM #11330


Description

@735547951
  1. I have a LiteLLM proxy running locally:

     litellm --config config.yaml

  2. Calling GitHub Copilot chat completions through the LiteLLM SDK directly works and returns a response:
```python
from litellm import completion

model_name = "github_copilot/gpt-4"

messages = [{"role": "user", "content": "Write a Python function to calculate fibonacci numbers"}]

extra_headers = {"editor-version": "vscode/1.85.1", "Copilot-Integration-Id": "vscode-chat"}

try:
    response = completion(model=model_name, messages=messages, extra_headers=extra_headers)
    print("Copilot output:\n", response)
except Exception as e:
    print("Error:", e)
```

  3. Following https://docs.all-hands.dev/openhands/usage/llms/litellm-proxy to use the LiteLLM proxy with OpenHands fails with the error shown in the log below.
     So where should I configure these extra_headers? (One proxy-side option is sketched right after this list, before the log.)
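One way to answer question 3 is to pin the headers on the proxy side, so clients like OpenHands never have to send them. Below is a minimal config.yaml sketch, assuming that entries under litellm_params are passed through as completion() kwargs (extra_headers is an ordinary completion() argument); the header values are copied from the working SDK call above:

```yaml
# config.yaml -- proxy-side sketch; assumes litellm_params entries
# are forwarded as completion() kwargs to the github_copilot provider.
model_list:
  - model_name: github_copilot/gpt-4
    litellm_params:
      model: github_copilot/gpt-4
      extra_headers:
        editor-version: "vscode/1.85.1"
        Copilot-Integration-Id: "vscode-chat"
```

With the headers baked into the proxy config, OpenHands only needs the proxy's base URL, an API key, and the model name, as described in the linked litellm-proxy doc.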

```
Please visit https://github.com/login/device and enter code D7E2-FD37 to authenticate.
06:10:54 - openhands:ERROR: conversation_summary.py:63 - Error generating conversation title: litellm.BadRequestError: Error code: 400 - {'error': {'message': 'litellm.BadRequestError: Github_copilotException - bad request: missing Editor-Version header for IDE auth. Received Model Group=github_copilot/gpt-4\nAvailable Model Group Fallbacks=None', 'type': None, 'param': None, 'code': '400'}}
06:10:54 - openhands:INFO: conversation_summary.py:138 - Generated title using truncation: ?
INFO: 127.0.0.1:52968 - "GET /api/conversations/3046b22a9ebf47d08e0f94b281037c8f HTTP/1.1" 200 OK
06:11:04 - openhands:INFO: standalone_conversation_manager.py:144 - ServerConversation 3046b22a9ebf47d08e0f94b281037c8f connected in 0.0002601146697998047 seconds
06:11:04 - openhands:INFO: standalone_conversation_manager.py:112 - Reusing detached conversation 3046b22a9ebf47d08e0f94b281037c8f
INFO: 127.0.0.1:41822 - "GET /api/conversations/3046b22a9ebf47d08e0f94b281037c8f/vscode-url HTTP/1.1" 200 OK
INFO: 127.0.0.1:41810 - "GET /api/conversations/3046b22a9ebf47d08e0f94b281037c8f/web-hosts HTTP/1.1" 200 OK
Please visit https://github.com/login/device and enter code 1DDE-7254 to authenticate.
06:11:58 - openhands:ERROR: agent_controller.py:365 - [Agent Controller 3046b22a9ebf47d08e0f94b281037c8f] Error while running the agent (session ID: 3046b22a9ebf47d08e0f94b281037c8f): litellm.BadRequestError: Error code: 400 - {'error': {'message': 'litellm.BadRequestError: Github_copilotException - bad request: missing Editor-Version header for IDE auth. Received Model Group=github_copilot/gpt-4\nAvailable Model Group Fallbacks=None', 'type': None, 'param': None, 'code': '400'}}. Traceback: Traceback (most recent call last):
File "/app/.venv/lib/python3.12/site-packages/litellm/llms/openai/openai.py", line 725, in completion
raise e
File "/app/.venv/lib/python3.12/site-packages/litellm/llms/openai/openai.py", line 653, in completion
) = self.make_sync_openai_chat_completion_request(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/litellm/litellm_core_utils/logging_utils.py", line 149, in sync_wrapper
result = func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/litellm/llms/openai/openai.py", line 471, in make_sync_openai_chat_completion_request
raise e
File "/app/.venv/lib/python3.12/site-packages/litellm/llms/openai/openai.py", line 453, in make_sync_openai_chat_completion_request
raw_response = openai_client.chat.completions.with_raw_response.create(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/openai/_legacy_response.py", line 364, in wrapped
return cast(LegacyAPIResponse[R], func(*args, **kwargs))
^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/openai/_utils/_utils.py", line 287, in wrapper
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/openai/resources/chat/completions/completions.py", line 1150, in create
return self._post(
^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 1259, in post
return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 1047, in request
raise self._make_status_error_from_response(err.response) from None
openai.BadRequestError: Error code: 400 - {'error': {'message': 'litellm.BadRequestError: Github_copilotException - bad request: missing Editor-Version header for IDE auth. Received Model Group=github_copilot/gpt-4\nAvailable Model Group Fallbacks=None', 'type': None, 'param': None, 'code': '400'}}
```
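Before involving OpenHands at all, it may help to reproduce the request against the proxy with a bare OpenAI client. This is a sketch, assuming the proxy listens on the default http://localhost:4000 and that sk-1234 is a placeholder for your proxy master key; note that extra_headers here only travels from the client to the proxy, so it resolves the 400 only if the proxy forwards client headers upstream or already injects them from config.yaml as sketched above:

```python
from openai import OpenAI

# Placeholder base URL and key -- adjust to the local proxy's
# actual port and master key.
client = OpenAI(base_url="http://localhost:4000/v1", api_key="sk-1234")

response = client.chat.completions.create(
    model="github_copilot/gpt-4",
    messages=[{"role": "user", "content": "Say hello"}],
    # Client-side headers; redundant once config.yaml pins them,
    # and only effective if the proxy forwards them upstream.
    extra_headers={
        "editor-version": "vscode/1.85.1",
        "Copilot-Integration-Id": "vscode-chat",
    },
)
print(response.choices[0].message.content)
```

If this call succeeds even without extra_headers, the proxy config is doing its job and the OpenHands error should disappear as well.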
