This repository has been archived by the owner on Jan 10, 2025. It is now read-only.
Hi, I would like to know if the versions of the libraries in `requirements.txt` can be provided. I am currently on the latest TF (2.11), which appears to have a breaking change in the optimizer: the base-class initialization in `AdamWeightDecay()` (https://github.com/google-research/pix2seq/blob/main/models/model_utils.py#L104) passes arguments that no longer match the corresponding parameters of the Keras 2.11 `Adam` optimizer (https://github.com/keras-team/keras/blob/v2.11.0/keras/optimizers/optimizer_experimental/adam.py#L86). Thanks!
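
For context, here is a minimal sketch of one possible workaround, assuming the goal is simply to keep the pre-2.11 `Adam` constructor behaviour; `AdamWeightDecaySketch` and the version check below are my own illustrative names, not code from this repo. In TF 2.11 the previous optimizer implementation is still available under `tf.keras.optimizers.legacy`, so a subclass written against the old signature can inherit from that instead:

```python
# Illustrative sketch only (not the repo's AdamWeightDecay): pick the legacy Adam
# base class on TF >= 2.11, where the new experimental optimizer became the default.
import tensorflow as tf

_TF_MAJOR_MINOR = tuple(int(v) for v in tf.__version__.split(".")[:2])

AdamBase = (
    tf.keras.optimizers.legacy.Adam  # old constructor signature, kept in TF >= 2.11
    if _TF_MAJOR_MINOR >= (2, 11)
    else tf.keras.optimizers.Adam    # pre-2.11: this is already the old implementation
)

class AdamWeightDecaySketch(AdamBase):
    """Adam with an extra weight-decay argument kept on the subclass."""

    def __init__(self, learning_rate=1e-4, weight_decay_rate=0.0, **kwargs):
        # Forward only keywords that the base Adam accepts in both APIs.
        super().__init__(learning_rate=learning_rate, **kwargs)
        self.weight_decay_rate = weight_decay_rate

opt = AdamWeightDecaySketch(learning_rate=3e-4, weight_decay_rate=0.05)
```

The other obvious way to sidestep the signature change would be to pin `tensorflow<2.11` in `requirements.txt`.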

I think the code should be compatible with the latest (stable) version of TensorFlow. `AdamWeightDecay` is a subclass of the default Adam optimizer in Keras, so it is fine for `AdamWeightDecay` to take different input arguments, as long as the arguments it passes on to `Adam` stay consistent with `Adam`'s own signature (which they do).
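
To illustrate that point with a sketch of my own (`MyAdamWithExtras` below is hypothetical, not the repo's `AdamWeightDecay`): a subclass can expose extra constructor arguments, provided whatever it forwards to `super().__init__` matches parameters the base `Adam` actually accepts, and forwarding by keyword keeps that true even when the base signature changes between TF releases.

```python
# Illustrative sketch only: a subclass with an extra argument of its own that
# forwards a consistent keyword set to the base Adam.
import tensorflow as tf

class MyAdamWithExtras(tf.keras.optimizers.Adam):
    def __init__(self, learning_rate=1e-3, extra_decay_rate=0.01,
                 name="MyAdamWithExtras", **kwargs):
        # learning_rate and name are accepted by Adam both before and after the
        # TF 2.11 optimizer rewrite; passing them by keyword avoids relying on
        # their position in the base signature.
        super().__init__(learning_rate=learning_rate, name=name, **kwargs)
        self.extra_decay_rate = extra_decay_rate  # subclass-only argument

opt = MyAdamWithExtras(learning_rate=3e-4, extra_decay_rate=0.05)
```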