save the trained agent for reuse #74
Comments
Hi. We are actively working on this now. It should come out in a few days. I will notify you here when our solution is out. Thanks!
Update: we have already put quite some time into this, but other issues got in the way. I hope to get back to it within the next couple of weeks.
Hi @rodrigodesalvobraz! I hope you're doing well. I wanted to check in on the status of this issue, as it seems to have been pending for some time. Could you please provide an update? Thank you in advance for your help!
Thanks for checking. Yes, it's been pending for a while, and in fact most of the work has already been done, but then we had to focus on other things.
Thank you for the quick reply, @rodrigodesalvobraz. Cheers!
@Luke3Py, I didn't forget about this. Just working with quite limited bandwidth.
Hello! Is there a status update for this issue? I would also love to be able to save the agent; I'm running into some odd errors when trying to use pickle.dump.
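(For reference, the kind of attempt described above looks roughly like the sketch below. The `agent` here is a stand-in built from a plain torch module, not Pearl's actual agent class, and the file name is arbitrary; whole-object pickling is where errors typically surface when the real agent holds unpicklable members.)

```python
import pickle
import torch

# Stand-in for a trained agent: any object whose learned state lives in torch modules.
agent = torch.nn.Linear(4, 2)

# Naive whole-object pickling. With a real agent this is where odd errors tend to
# appear if the object holds unpicklable members (device handles, lambdas, loggers).
with open("agent.pkl", "wb") as f:
    pickle.dump(agent, f)

# Loading it back, when pickling does succeed:
with open("agent.pkl", "rb") as f:
    restored = pickle.load(f)
```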
Hi @sebastiancoombs. Sorry, we have very little bandwidth for the open-source side of the project, and more recently I have been spending it on the pending Pull Request. But your request raises the priority here, so I will try harder to get back to it. Sorry about the delay.
@rodrigodesalvobraz No problem! This lib is awesome, and I appreciate the great work. Thanks for the update!
Thank you. Your request did motivate me to get back to it today, so hopefully this will be completed soon.
I would like to know whether there is a way to save the trained agent, as I did not see this part in the tutorial, nor did I find any related methods in the agent class.
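Until saving is supported natively, a common generic workaround is to persist the learned parameters rather than the whole agent object. The sketch below assumes the agent's learned state lives in ordinary `torch.nn.Module` objects (for example, the networks owned by the policy learner); it is illustrative only and does not use Pearl's actual API.

```python
import torch

# Stand-in for a trained agent's learnable component; with a real agent this would be
# the network(s) it owns (attribute names and architecture here are assumptions).
policy_network = torch.nn.Sequential(
    torch.nn.Linear(4, 64),
    torch.nn.ReLU(),
    torch.nn.Linear(64, 2),
)

# Save only the learned parameters instead of pickling the whole agent object.
torch.save(policy_network.state_dict(), "policy_network.pt")

# Later: rebuild an identically shaped network and load the saved parameters.
restored = torch.nn.Sequential(
    torch.nn.Linear(4, 64),
    torch.nn.ReLU(),
    torch.nn.Linear(64, 2),
)
restored.load_state_dict(torch.load("policy_network.pt"))
restored.eval()
```

This avoids the unpicklable-member problem entirely, at the cost of having to reconstruct the agent and its networks with the same configuration before loading.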