Replies: 1 comment 1 reply
Nice timing, I just made a post about that with a summarizing technique. It doesn't help with the sending of tokens, but it sure helps with the receiving of them.
For now, Auto-GPT seems to summarize and chunk web content or web pages before sending a request to OpenAI, since there is a token limit. But it does not chunk large .txt or .py files when it reads them with the read_file command, which makes the token count very large before the request is sent to OpenAI.
Can this be fixed?