Context restrictions #283

Open
RegxeleratorAdmin opened this issue Mar 21, 2025 · 1 comment
Labels
question Question about using the SDK

Comments


RegxeleratorAdmin commented Mar 21, 2025

Hi -

I wanted to clarify my understanding of the limits on supplying agents with context/input.
There appears to be a global limit of 256,000 characters on string size, regardless of whether the input is provided as context, as part of the input message, or as output from a function tool call.

Is there currently any way to supply the LLM with an input string larger than 256,000 characters?

Thanks in advance for any feedback.

@RegxeleratorAdmin RegxeleratorAdmin added the question Question about using the SDK label Mar 21, 2025
@rm-openai
Collaborator

Nope. Any restrictions here come directly from the model APIs, so I would recommend truncating things.
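
The truncation suggested above can be sketched as a small pre-processing helper. This is a minimal sketch, not part of the SDK: the helper name `truncate_for_model`, the `marker` string, and treating 256,000 characters as the hard cap are all assumptions based on this thread.

```python
# Hedged sketch: clamp oversized strings before handing them to an agent
# as context, message input, or tool output. The 256,000-character figure
# comes from this issue; the helper itself is hypothetical, not SDK API.

MAX_CHARS = 256_000  # limit reported in this thread (imposed by the model API)

def truncate_for_model(text: str, limit: int = MAX_CHARS,
                       marker: str = "\n...[truncated]") -> str:
    """Return text unchanged if it fits, else cut it and append a marker."""
    if len(text) <= limit:
        return text
    # Reserve room for the marker so the result still fits within the limit.
    return text[: limit - len(marker)] + marker
```

Smarter strategies (summarizing older turns, chunking documents and retrieving only relevant pieces) preserve more information, but a hard character clamp like this is the simplest way to stay under the API's limit.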
