Fix sparse novograd #1970

Merged

Conversation

@WindQAQ (Member) commented Jul 5, 2020

Fixes #1087.

  • Gathering indices is handled inside resource_sparse_apply_keras_momentum.
  • Sparse weight decay wasn't correct: weight decay only needs to be applied to the sparse gradient, so tf.gather is used to extract the rows of var at the gradient's indices (see the first sketch below).
  • The original tests weren't robust: they do call _resource_apply_sparse, but the test cases themselves are dense (a genuinely sparse check is sketched below as well).
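
A minimal sketch of the weight-decay point above, assuming a NovoGrad-style _resource_apply_sparse that receives the sparse gradient, the variable, and the gradient's indices; the helper and argument names here are illustrative, not the exact tensorflow/addons code:

```python
import tensorflow as tf


def _sparse_grad_with_weight_decay(grad, var, indices, weight_decay):
    # Illustrative helper (not the actual addons implementation): the decay
    # term is built only from the rows of `var` that the sparse gradient
    # touches, so the update stays sparse instead of densifying the variable.
    if weight_decay > 0.0:
        grad = grad + weight_decay * tf.gather(var, indices)
    return grad
```

For the test gap, a gradient that actually exercises the sparse path can be produced with an embedding-style lookup, which is also the setup from #1087. A hypothetical check (sizes and hyperparameters made up) might look like:

```python
import tensorflow as tf
import tensorflow_addons as tfa

# Gradients of a tf.gather lookup are IndexedSlices, so applying them goes
# through _resource_apply_sparse rather than the dense branch.
emb = tf.Variable(tf.random.normal([10, 4]))
opt = tfa.optimizers.NovoGrad(learning_rate=0.1, weight_decay=1e-4)

with tf.GradientTape() as tape:
    rows = tf.gather(emb, [0, 2, 2])
    loss = tf.reduce_sum(rows ** 2)

grads = tape.gradient(loss, [emb])
assert isinstance(grads[0], tf.IndexedSlices)
opt.apply_gradients(zip(grads, [emb]))
```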

@bot-of-gabrieldemarmiesse

@shreyashpatodia

You are the owner of some files modified in this pull request.
Would you kindly review the changes whenever you have the time?
Thank you very much.

@WindQAQ requested a review from a team July 5, 2020 23:16
@seanpmorgan (Member) left a comment

LGTM Thanks!

@seanpmorgan merged commit fc93fa1 into tensorflow:master Jul 6, 2020
ashutosh1919 pushed a commit to ashutosh1919/addons that referenced this pull request Jul 12, 2020
* Fix sparse novograd
* Fix sparse weight decay
* Reuse original tests
jrruijli pushed a commit to jrruijli/addons that referenced this pull request Dec 23, 2020
* Fix sparse novograd
* Fix sparse weight decay
* Reuse original tests
Development

Successfully merging this pull request may close these issues.

NovoGrad optimizer fails with embedding layer
5 participants