Response in medium model decreases over time. #84
Comments
Try to tweak the `max_length`.
Same problem I'm facing; tweaking `max_length` hasn't helped.
You might be correct, and this might be an issue with the length of the chat history.
Try to tweak the `max_length`. As for what would cause this issue, I don't really know what to say.
Thanks for the help, but that seems like kind of a weird way to do it. A lot of RAM will be consumed if the bot talks for a long time. There should be a way to trim the history tokens: for example, once the history grows past the last few exchanges, the oldest chat-history tokens should be dropped.
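A minimal sketch of that kind of trimming, assuming the usual `chat_history_ids` tensor of shape `(1, seq_len)` from the Hugging Face example loop (the helper name and the 500-token budget here are just illustrative, not something from this thread):

```python
import torch

MAX_HISTORY_TOKENS = 500  # illustrative budget, tune for your model and memory


def trim_history(chat_history_ids: torch.Tensor,
                 max_tokens: int = MAX_HISTORY_TOKENS) -> torch.Tensor:
    """Keep only the most recent `max_tokens` tokens of the chat history.

    The oldest tokens are dropped from the front of the sequence, so the
    conversation can keep going without the context growing unbounded.
    """
    if chat_history_ids.shape[-1] > max_tokens:
        return chat_history_ids[:, -max_tokens:]
    return chat_history_ids
```

Calling this right after each `model.generate(...)` keeps memory use and generation time roughly flat, at the cost of the bot forgetting the oldest turns.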
I'm having the same issue; increasing `max_length` doesn't seem to help. Before the replies disappear entirely, it starts to give shorter and shorter answers. Any idea what can cause such behavior? I'm using the medium-sized model with a max_length of 2000, and I get conversations like the one below. What causes this eventual shortening and then disappearance of replies, and what can I do to change it? I'd like to be able to hold a conversation indefinitely.
I think the problem is that you have to lower the max history length to below 500, or else DialoGPT gets stuck.
I'm using a modified version of the example code provided on the Hugging Face website.
After a few lines, the responses start to become shorter and shorter until the model just doesn't output anything anymore.
I tried changing the max_length to about 5000 and it doesn't seem to do anything. I've also tried getting rid of the history (i.e., just using the `new_user_input_ids` variable), and that seemed to fix the issue, but it obviously leads to very random responses, since the model has no context for what we're talking about.
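For context, the loop being discussed is roughly the example from the Hugging Face DialoGPT model card; the sketch below is reproduced from memory, so details may differ slightly, and the commented-out line marks where the history cap suggested above would go:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("microsoft/DialoGPT-medium")
model = AutoModelForCausalLM.from_pretrained("microsoft/DialoGPT-medium")

chat_history_ids = None
for step in range(5):
    # Encode the user input, appending the end-of-string token.
    new_user_input_ids = tokenizer.encode(
        input(">> User: ") + tokenizer.eos_token, return_tensors="pt"
    )

    # Append the new input to the running chat history.
    bot_input_ids = (
        torch.cat([chat_history_ids, new_user_input_ids], dim=-1)
        if step > 0
        else new_user_input_ids
    )

    # Optional: cap the history length, as suggested above.
    # bot_input_ids = bot_input_ids[:, -500:]

    # Generate a response; the full sequence becomes the new history.
    chat_history_ids = model.generate(
        bot_input_ids, max_length=1000, pad_token_id=tokenizer.eos_token_id
    )

    # Print only the newly generated tokens.
    print(
        "DialoGPT:",
        tokenizer.decode(
            chat_history_ids[:, bot_input_ids.shape[-1]:][0],
            skip_special_tokens=True,
        ),
    )
```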