Litellm integration is broken #348

Open
enniofilicicchia opened this issue Jan 23, 2025 · 2 comments

@enniofilicicchia

The integration never even enters the function Opentelemetry._handle_sucess in litellm.integrations.opentelemetry.py.

import os
import litellm

litellm.callbacks = ["otel"]
os.environ["LMNR_PROJECT_API_KEY"] = "yyy"
os.environ["OTEL_EXPORTER"] = "otlp_grpc"
os.environ["OTEL_ENDPOINT"] = "http://mydomain.com:8000"
# Single quotes for the inner lookup: nested double quotes inside an
# f-string are only valid on Python 3.12+.
os.environ["OTEL_HEADERS"] = (
    f"authorization=Bearer {os.environ['LMNR_PROJECT_API_KEY']}"
)
os.environ["DEBUG_OTEL"] = "true"

I stepped through a debug run and none of the OpenTelemetry logging methods is ever entered.

  1. LiteLLM is very popular; it would be great for Laminar to have a working, reliable integration!
  2. In the meantime I would be happy to manually send the POST request to the traces endpoint, but I cannot find anything about /api/v1/traces in the docs or the OpenAPI spec.
  3. As a side note, it's unclear whether the endpoint should include "/api", "/api/v1", or "/api/v1/traces", but this is not the main problem.
@dinmukhamedm
Member

Hey, thanks for opening the issue! I think this has to do with how litellm updated its dependency management. Can you try adding the proxy extra of litellm, e.g.

pip install 'litellm[proxy]'
# or: uv add litellm --extra proxy
# or: poetry add litellm --extra proxy

Also be sure to have the following packages installed: opentelemetry-api, opentelemetry-sdk, and opentelemetry-exporter-otlp.
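
For reference, a minimal check after installing those packages could look roughly like this, keeping the same environment variables from your snippet (untested sketch; the model name is just a placeholder for whichever provider you have credentials for):

import litellm

# Assumes the LMNR_PROJECT_API_KEY / OTEL_* environment variables are
# already set exactly as in the snippet above.
litellm.callbacks = ["otel"]

response = litellm.completion(
    model="gpt-4o-mini",  # placeholder; any configured model works
    messages=[{"role": "user", "content": "ping"}],
)
print(response.choices[0].message.content)

# With DEBUG_OTEL=true you should now see the OpenTelemetry debug output,
# and the callback methods in litellm.integrations.opentelemetry
# should be entered.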

Please let us know if this resolves your issue. I will update the docs both on our side and on the LiteLLM side.


To your questions 2 and 3: it must be /v1/traces, but it accepts a protobuf-encoded payload (see more in our docs), so I don't think an API spec will help much. In addition, the HTTP connection is less reliable, so we advise against it; the decision not to document that bit is half-conscious.
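
If you still want to go the HTTP route, the standard OTLP/HTTP span exporter from opentelemetry-exporter-otlp already produces the protobuf payload for you, so you don't need to craft the POST request by hand. A rough, untested sketch (endpoint and key are placeholders):

import os

from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

# Placeholder endpoint and key -- substitute your real values.
exporter = OTLPSpanExporter(
    endpoint="https://mydomain.com:8000/v1/traces",
    headers={"authorization": f"Bearer {os.environ['LMNR_PROJECT_API_KEY']}"},
)

provider = TracerProvider()
provider.add_span_processor(BatchSpanProcessor(exporter))
trace.set_tracer_provider(provider)

tracer = trace.get_tracer(__name__)
with tracer.start_as_current_span("manual-test-span"):
    pass  # whatever work you want traced

provider.shutdown()  # flush pending spans before exiting

That said, as noted above, the gRPC exporter is the more reliable option.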

@dinmukhamedm
Member

Also, for reference: BerriAI/litellm#8169
