[Enhancement] Add Configuration and Support for Open Telemetry #7632
Replies: 3 comments 3 replies
-
+1 on this. We use different LLM models and would like some visibility into how they are utilized by our users.
-
I think at least some of the underlying client libraries are already OTel-instrumented, for what it's worth.
-
A friend and I have taken a shot at implementing this and we have something working for both the backend and the front end. It's mostly additions, aside from some changes to the environment model. Happy to provide more info if it would help drive approval for this feature request!
-
What features would you like to see added?
Description
It would be nice to be able to configure an OTEL endpoint to send trace and log data for centralized monitoring in environments that use an observability layer (e.g., Datadog, New Relic, Honeycomb.io, etc.).
This would improve the supportability of LibreChat deployments that enable the feature.
More Details
Ideally both the backend and the UI would be able to emit OTel data, each optionally driven by configuration (e.g., one could enable only the backend or only the UI).
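As a sketch of what the UI side could look like (illustrative only, not our implementation): the `VITE_OTEL_*` flags below are hypothetical placeholders for whatever configuration mechanism the feature ends up using, and `registerInstrumentations` comes from `@opentelemetry/instrumentation`, which the auto-instrumentation packages already depend on.

```js
// Client-side OTel bootstrap — illustrative sketch, not LibreChat's actual frontend code.
import { WebTracerProvider } from '@opentelemetry/sdk-trace-web';
import { BatchSpanProcessor } from '@opentelemetry/sdk-trace-base';
import { OTLPTraceExporter } from '@opentelemetry/exporter-trace-otlp-proto';
import { registerInstrumentations } from '@opentelemetry/instrumentation';
import { getWebAutoInstrumentations } from '@opentelemetry/auto-instrumentations-web';

// Hypothetical build-time flag so the UI can be toggled independently of the backend.
if (import.meta.env.VITE_OTEL_ENABLED === 'true') {
  const provider = new WebTracerProvider({
    spanProcessors: [
      new BatchSpanProcessor(
        new OTLPTraceExporter({ url: import.meta.env.VITE_OTEL_TRACES_URL }),
      ),
    ],
  });
  provider.register(); // registers the global tracer provider for the page

  // Covers document load, fetch/XHR, and user-interaction spans out of the box.
  registerInstrumentations({ instrumentations: [getWebAutoInstrumentations()] });
}
```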
For the backend, there would need to be an integration point with the winston logger so that it also emits OTel log records (a rough sketch follows the dependency list below).
This would involve taking additional dependencies on the following libraries:
"@opentelemetry/api": "^1.9.0",
"@opentelemetry/api-logs": "^0.201.0",
"@opentelemetry/auto-instrumentations-node": "^0.59.0",
"@opentelemetry/auto-instrumentations-web": "^0.47.0",
"@opentelemetry/exporter-logs-otlp-http": "^0.201.1",
"@opentelemetry/exporter-metrics-otlp-proto": "^0.201.1",
"@opentelemetry/exporter-trace-otlp-proto": "^0.201.1",
"@opentelemetry/instrumentation-express": "^0.50.0",
"@opentelemetry/instrumentation-http": "^0.201.1",
"@opentelemetry/instrumentation-winston": "^0.46.0",
"@opentelemetry/resources": "^2.0.1",
"@opentelemetry/sdk-logs": "^0.201.1",
"@opentelemetry/sdk-metrics": "^2.0.1",
"@opentelemetry/sdk-node": "^0.201.1",
"@opentelemetry/sdk-trace-base": "^2.0.1",
"@opentelemetry/sdk-trace-node": "^2.0.1",
"@opentelemetry/sdk-trace-web": "^2.0.1",
"@opentelemetry/semantic-conventions": "^1.34.0",
"@opentelemetry/winston-transport": "^0.12.0"
Why this feature
When enabled, it improves the supportability of LibreChat deployments.
Which components are impacted by your request?
UI
Backend