
Optimizer doesn't have an apply() method #20801

Open
konstantin-frolov opened this issue Jan 23, 2025 · 4 comments
Assignees
Labels
stale stat:awaiting response from contributor type:support User is asking for help / asking an implementation question. Stackoverflow would be better suited.

Comments

@konstantin-frolov

konstantin-frolov commented Jan 23, 2025

Environment:

Python 3.12.7
Tensorflow 2.16.1
Keras 3.8.0

Problem:

None of the optimizers in tf.keras.optimizers has an apply() method for writing a training routine from scratch, even though the docs state that one exists.

AttributeError                            Traceback (most recent call last)
Cell In[11], line 1
----> 1 getattr(tf.keras.optimizers.Optimizer(name="test"), "apply")

AttributeError: 'Optimizer' object has no attribute 'apply'

Possible solution

Use the apply_gradients(zip(gradients, model_parameters)) method instead.
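To illustrate the workaround, here is a minimal custom-training-loop sketch using apply_gradients with a toy model (the model, data, and learning rate are made up for the example, not taken from the issue):

```python
import tensorflow as tf

# Hypothetical toy setup: fit a single Dense layer to a linear target.
model = tf.keras.Sequential([tf.keras.layers.Dense(1)])
optimizer = tf.keras.optimizers.Adam(learning_rate=0.1)
x = tf.random.normal((32, 4))
y = tf.reduce_sum(x, axis=1, keepdims=True)

for _ in range(5):
    with tf.GradientTape() as tape:
        loss = tf.reduce_mean(tf.square(model(x) - y))
    gradients = tape.gradient(loss, model.trainable_variables)
    # apply_gradients expects an iterable of (gradient, variable) pairs,
    # hence the zip() over gradients and trainable variables.
    optimizer.apply_gradients(zip(gradients, model.trainable_variables))
```

This pattern works on both Keras 2 and Keras 3, since apply_gradients exists in both.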

@dhantule dhantule added the type:support User is asking for help / asking an implementation question. Stackoverflow would be better suited. label Jan 27, 2025
@dhantule
Contributor

@konstantin-frolov, thanks for reporting this.

If you're trying to apply gradients to variables, you can use apply_gradients(zip(gradients, model_parameters)); you can read more about the apply_gradients method here. If that doesn't work, please provide some minimal reproducible code. Attaching a gist for reference.

@rameshdange5191
Contributor

@konstantin-frolov and @dhantule

  1. In the TensorFlow v2.16.1 Adam docs, the "View Source on GitHub" link (full XPath: /html/body/section/section/main/devsite-content/article/div[3]/div[2]/table/tbody/tr/td/a) points to Keras v3.3.3's Adam optimizer. Shouldn't it link to the Adam optimizer in TensorFlow v2.16.1's bundled Keras?
  2. Keras v3.3.3's base optimizer has the apply method, but I didn't find an apply method in TensorFlow v2.16.1's Keras optimizer_v2.
  3. Also, the TensorFlow v2.16.1 Keras docs say the file was autogenerated. Could there be an error in the code that autogenerated the documentation?

@dhantule dhantule added the keras-team-review-pending Pending review by a Keras team member. label Feb 12, 2025
@hertschuh
Collaborator

@konstantin-frolov

The base Optimizer class has an apply method defined here.

I tried to run the guide you linked and it worked for me.

AttributeError                            Traceback (most recent call last)
Cell In[11], line 1
----> 1 getattr(tf.keras.optimizers.Optimizer(name="test"), "apply")

AttributeError: 'Optimizer' object has no attribute 'apply'

How did you produce this error? What were the imports and code beforehand?


@rameshdange5191

  1. The link is correct. Starting from TensorFlow 2.16, tf.keras refers to Keras 3 in the keras repo. With TensorFlow 2.15 and earlier, tf.keras was Keras 2, which is now in the tf_keras repo. More info on this page.

  2. The code you linked (under tensorflow/python/keras) is actually never used and should have been removed.

  3. Yes, it is autogenerated from the keras repo, not tensorflow/python/keras.
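To see which Keras your TensorFlow actually resolves to (a small sketch of my own, not from the thread), you can print both versions:

```python
import tensorflow as tf

# TF >= 2.16 ships Keras 3 as tf.keras; earlier versions bundle Keras 2.
print(tf.__version__)
print(tf.keras.__version__)
```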

@hertschuh hertschuh removed the keras-team-review-pending Pending review by a Keras team member. label Feb 24, 2025

This issue is stale because it has been open for 14 days with no activity. It will be closed if no further activity occurs. Thank you.

@github-actions github-actions bot added the stale label Mar 12, 2025