
[Bug] Forking Breaks If Over 150k Tokens #5692

Open
ColbySerpa opened this issue Feb 3, 2025 · 4 comments
Labels
🐛 Bug Something isn't working | 缺陷

Comments

@ColbySerpa

You can reproduce this bug easily with Claude if your original chat is over 150k tokens and then you fork it. In the fork, as you approach 170k and 180k tokens, every message you send will fail because it says that the 200k token limit has been reached. For some reason, the token counting is inaccurate in the fork.

To work around it, I have to delete the failed message and press retry; then it works. Every single time I send a message in the fork, I have to delete the error reply and then retry, which is incredibly annoying and slows me down. It would be great if the forking feature were fixed.

If you have any questions please let me know.

@lobehubbot
Member

👀 @ColbySerpa

Thank you for raising an issue. We will investigate the matter and get back to you as soon as possible.
Please make sure you have given us as much context as possible.


dosubot bot commented Feb 3, 2025

Hey @ColbySerpa! I'm here to help you with any bugs, questions, or contributions. Let's tackle this issue together!

The issue you're experiencing with token counting in forked chats might be related to how tokens are calculated and managed in the framework. The token counting logic is implemented in the src/features/ChatInput/ActionBar/Token/TokenTag.tsx file, where various components like input, messageString, systemRole, historySummary, and toolsString are used to compute the total token count [1].

Additionally, the getMessagesTokenCount function in src/store/chat/helpers.ts calculates the token count for a list of messages by encoding their content [2]. There are tests for this function to ensure it returns the correct token count, but there is no specific mention of tests related to token management in forked chats [3].

The error you're encountering might be due to a discrepancy in how tokens are counted in forked chats compared to the original chat. It could be beneficial to review the token counting logic in these files to identify any potential issues or inaccuracies specific to forked chats.
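As a rough illustration of the counting approach described above, here is a minimal TypeScript sketch. This is not the actual lobe-chat code: the message shape and the characters-per-token heuristic are assumptions standing in for the real tokenizer-based encoding in `getMessagesTokenCount`.

```typescript
// Hypothetical sketch of message token counting, NOT the actual
// src/store/chat/helpers.ts implementation. The real code encodes
// message content with a tokenizer; a ~4-characters-per-token
// heuristic stands in here so the example is self-contained.
interface ChatMessage {
  role: 'system' | 'user' | 'assistant';
  content: string;
}

// Rough token estimate: ceil(length / 4) is a common approximation.
const estimateTokens = (text: string): number => Math.ceil(text.length / 4);

// Sum the estimated tokens across all messages in a conversation.
const getMessagesTokenCount = (messages: ChatMessage[]): number =>
  messages.reduce((total, m) => total + estimateTokens(m.content), 0);
```

If a fork were to re-count inherited context (for example, a history summary or system role) on top of the copied messages, the computed total could overshoot the real usage, which would be consistent with the premature "200k limit reached" symptom, though the thread does not confirm that this is the cause.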




@dosubot dosubot bot added the 🐛 Bug Something isn't working | 缺陷 label Feb 3, 2025
@github-project-automation github-project-automation bot moved this to Roadmap - Chat 1.x in Lobe Chat Routine Feb 3, 2025
@arvinxx
Contributor

arvinxx commented Feb 3, 2025

It seems to be a bug; I will figure it out.

@ColbySerpa
Author

It seems to be a bug; I will figure it out.

@arvinxx Thank you so much! It will make my life 1000x easier. 🙏

Projects
Status: Roadmap - Chat 1.x
Development

No branches or pull requests

3 participants