Description
Is your feature request related to a problem? Please describe.
The OpenAI platform released the Responses API, an alternative to the Chat Completions endpoint, last month:
https://platform.openai.com/docs/api-reference/responses
OpenAI now uses it by default in the Playground (which generates content) as well as in the new Codex CLI, written in TypeScript: https://github.com/openai/codex
I've successfully run Codex with the EDOT Node.js SDK; however, I only get HTTP spans for these operations because Codex, like the OpenAI Agents SDK, uses the Responses API instead of Chat Completions.
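For context on what the instrumentation would need to intercept, here is a minimal sketch of a raw Responses API call using Node's built-in `fetch`. The endpoint path and the `model`/`input` body fields follow the API reference linked above; the helper names and the model string are illustrative, not part of any SDK.

```typescript
// Sketch of a raw Responses API request (endpoint per the API reference above).
const RESPONSES_URL = "https://api.openai.com/v1/responses";

// Hypothetical helper: builds the minimal request body for the Responses API.
function buildResponsesRequest(model: string, input: string) {
  return { model, input };
}

// Hypothetical helper: POSTs the request. An instrumentation would need to
// recognize this endpoint, not just emit a generic HTTP client span.
async function createResponse(apiKey: string, model: string, input: string) {
  const res = await fetch(RESPONSES_URL, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify(buildResponsesRequest(model, input)),
  });
  return res.json();
}
```

Today, a call like this surfaces only as an HTTP span with the URL, with none of the GenAI-specific request or response details captured.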
Describe the solution you'd like
Logs, metrics, and tracing for the Responses API, like we have for Chat Completions:
- [ ] Tracing
- [ ] Metrics
- [ ] Logs
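As a hedged sketch of the tracing piece, the function below shows the kind of span attributes an instrumentation could record for a Responses call, using `gen_ai.*` attribute names from the OpenTelemetry GenAI semantic conventions. The `"responses"` operation name and the function itself are assumptions for illustration, not EDOT's actual implementation.

```typescript
// Illustrative shape of a Responses API request (subset of fields).
interface ResponsesRequest {
  model: string;
  input: string;
  temperature?: number;
}

// Hypothetical mapping from a Responses request to GenAI span attributes.
// Attribute keys follow the OTel GenAI semantic conventions; the
// "responses" operation name is an assumption, pending convention support.
function responsesSpanAttributes(
  req: ResponsesRequest
): Record<string, string | number> {
  const attrs: Record<string, string | number> = {
    "gen_ai.system": "openai",
    "gen_ai.operation.name": "responses",
    "gen_ai.request.model": req.model,
  };
  if (req.temperature !== undefined) {
    attrs["gen_ai.request.temperature"] = req.temperature;
  }
  return attrs;
}
```

The same mapping would feed metrics (e.g. token usage from the response body) and log events, mirroring what the Chat Completions instrumentation already emits.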
Describe alternatives you've considered
Modify https://github.com/openai/codex to use the Chat Completions API instead.
Additional context
announcement: https://community.openai.com/t/introducing-the-responses-api/1140929
Local inference servers adding support for core Responses API features:
- litellm https://github.com/BerriAI/litellm/releases/tag/v1.67.0-stable
- llama-stack https://github.com/meta-llama/llama-stack/releases/tag/v0.2.4
- masaic-ai https://github.com/masaic-ai-platform/open-responses/releases/tag/v0.1.3-M2
- ollama: Support for OpenAI Responses API (for Codex CLI compatibility), ollama/ollama#10309
- vllm: [Feature]: Support openai responses API interface, vllm-project/vllm#14721
Other JavaScript/TypeScript/Node SDKs
This is the same issue as the one filed for the EDOT Python SDK: elastic/elastic-otel-python-instrumentations#62