
prompt caching documentation items missing in java azure inference api #44187

@Plawn

Type of issue

Typo

Description

On https://learn.microsoft.com/en-us/azure/ai-services/openai/how-to/prompt-caching, the result example is:

{
  "created": 1729227448,
  "model": "o1-preview-2024-09-12",
  "object": "chat.completion",
  "service_tier": null,
  "system_fingerprint": "fp_50cdd5dc04",
  "usage": {
    "completion_tokens": 1518,
    "prompt_tokens": 1566,
    "total_tokens": 3084,
    "completion_tokens_details": {
      "audio_tokens": null,
      "reasoning_tokens": 576
    },
    "prompt_tokens_details": {
      "audio_tokens": null,
      "cached_tokens": 1408
    }
  }
}

The response includes the key cached_tokens (under prompt_tokens_details), which is not exposed by the Java Azure AI Inference SDK's usage model.

Can you add it to the SDK?
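
In the meantime, a minimal workaround sketch, assuming the raw response JSON is available and Jackson is on the classpath (the class name and abbreviated payload are hypothetical; the field names follow the documentation example above, not the SDK's typed model):

import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

public class CachedTokensWorkaround {
    public static void main(String[] args) throws Exception {
        // Abbreviated raw chat completion response, as in the documentation example above.
        String rawResponse = """
            {
              "usage": {
                "prompt_tokens": 1566,
                "prompt_tokens_details": { "cached_tokens": 1408 }
              }
            }
            """;

        // Read the untyped JSON, since the SDK's usage model does not expose
        // prompt_tokens_details.cached_tokens.
        ObjectMapper mapper = new ObjectMapper();
        JsonNode usage = mapper.readTree(rawResponse).path("usage");
        int cachedTokens = usage.path("prompt_tokens_details")
                                .path("cached_tokens")
                                .asInt(0);

        System.out.println("Cached prompt tokens: " + cachedTokens);
    }
}

Surfacing cached_tokens directly on the SDK's usage type would remove the need for this kind of raw-JSON parsing.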

Page URL

https://learn.microsoft.com/en-us/java/api/overview/azure/ai-inference-readme?view=azure-java-preview

Content source URL

https://github.com/Azure/azure-docs-sdk-java/blob/master/docs-ref-services/preview/ai-inference-readme.md

Document Version Independent Id

c69d7a40-f774-f7ac-9f05-7ca51551f35e

Article author

@azure-sdk

Metadata

  • ID: 45c57ccc-a843-5f28-dc24-64cb9048778f
  • PlatformId: cf3cb4f7-dc21-fce0-4485-8f9d7237b3a2
  • Service: ai

Labels

  • OpenAI
  • customer-reported: Issues that are reported by GitHub users external to the Azure organization.
  • feature-request: This issue requires a new behavior in the product in order to be resolved.
  • issue-addressed: Workflow: The Azure SDK team believes it to be addressed and ready to close.
  • question: The issue doesn't require a change to the product in order to be resolved. Most issues start as that.
