
Update prompt token usage for input detection on chat completions #295

Open
@evaline-ju

Description


As an orchestrator user, I want the /chat/completions-detection endpoint to report how many prompt tokens were used even when input detections are found, so that I have the token count for billing or informational purposes.

Discussion

usage.prompt_tokens (ref. https://platform.openai.com/docs/api-reference/chat/object#chat/object-usage). Currently, a separate tokenization call is made for text generation to obtain input token information, so the "tokenization" equivalent for chat completions may need to be investigated.
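As a rough illustration of the desired behavior, the sketch below builds a chat-completions-style response that still populates `usage.prompt_tokens` when input detections short-circuit generation. The `count_prompt_tokens` helper, the whitespace-based count (standing in for a real tokenization call), and the exact response shape are all assumptions, not the orchestrator's actual implementation.

```python
def count_prompt_tokens(messages):
    # Stand-in for a real tokenization call; a production
    # implementation would use the model's own tokenizer.
    return sum(len(m["content"].split()) for m in messages)

def detection_response(messages, input_detections):
    # Hypothetical response shape: even though input detections
    # mean no completion is generated, usage.prompt_tokens is
    # still reported for billing/informational purposes.
    prompt_tokens = count_prompt_tokens(messages)
    return {
        "choices": [],
        "detections": {"input": input_detections},
        "usage": {
            "prompt_tokens": prompt_tokens,
            "completion_tokens": 0,
            "total_tokens": prompt_tokens,
        },
    }

resp = detection_response(
    [{"role": "user", "content": "some flagged text here"}],
    [{"detection_type": "pii", "score": 0.99}],
)
print(resp["usage"]["prompt_tokens"])  # → 4
```

The point of the sketch is only that the usage block is computed before (and independently of) generation, so it survives the early return on input detections.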

Acceptance Criteria

  • Unit tests cover new/changed code
  • Examples build against new/changed code
  • READMEs are updated
  • Type of semantic version change is identified

Metadata

Assignees

No one assigned

Labels

enhancement (New feature or request)

Type

No type

Projects

No projects

Milestone

No milestone

Relationships

None yet

Development

No branches or pull requests