Relax gradient assertion for mish activation #1415

Merged (1 commit, Mar 26, 2020)

Conversation

seanpmorgan
Member

Dug into this a bit and found that this check could conceivably fail on TF2.1 as well. It does empirically appear more common on TF2.2, and I don't have an answer as to what changed. As previously discussed, we have no guarantee that the Python gradient will behave exactly the same at this precision:
#1220 (comment)

Part of #1320
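For context, the assertion being relaxed compares TensorFlow's registered gradient for mish against a Python reference. As a minimal pure-Python sketch (not the actual addons test code), here is mish, its analytic gradient, and a finite-difference check; the finite-difference estimate is only accurate to roughly `sqrt(eps)`, which illustrates why exact agreement at tight tolerances cannot be expected:

```python
import math

def softplus(x):
    # numerically stable softplus: log(1 + exp(x))
    return math.log1p(math.exp(-abs(x))) + max(x, 0.0)

def mish(x):
    # Mish activation: x * tanh(softplus(x))
    return x * math.tanh(softplus(x))

def mish_grad(x):
    # analytic derivative: tanh(sp) + x * sigmoid(x) * (1 - tanh(sp)^2)
    t = math.tanh(softplus(x))
    sigmoid = 1.0 / (1.0 + math.exp(-x))
    return t + x * sigmoid * (1.0 - t * t)

def numeric_grad(f, x, eps=1e-6):
    # central finite difference; limited precision is exactly why
    # bitwise equality with a framework's gradient is too strict
    return (f(x + eps) - f(x - eps)) / (2.0 * eps)
```

With a loose tolerance such as `1e-4` the analytic and numeric gradients agree; at float32 machine precision they generally will not, hence the relaxed assertion.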

@bot-of-gabrieldemarmiesse

@digantamisra98

You are the owner of some files modified in this pull request.
Would you kindly review the changes whenever you have the time?
Thank you very much.

@gabrieldemarmiesse
Member

I also noticed that only a small fraction of values (<5%) differed when printing the arrays. I guess it's OK then.
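A relaxed check in this spirit (a hypothetical helper, not the actual addons test) could tolerate a small fraction of mismatched elements instead of requiring element-wise agreement everywhere:

```python
def fraction_mismatched(a, b, atol=1e-6):
    # fraction of element pairs whose absolute difference exceeds atol
    assert len(a) == len(b) and len(a) > 0
    bad = sum(1 for x, y in zip(a, b) if abs(x - y) > atol)
    return bad / len(a)

# a test could then assert e.g. fraction_mismatched(tf_grad, py_grad) < 0.05
```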

@gabrieldemarmiesse gabrieldemarmiesse merged commit 98a4a70 into tensorflow:master Mar 26, 2020
@seanpmorgan seanpmorgan deleted the mish-2.2 branch March 26, 2020 12:26
jrruijli pushed a commit to jrruijli/addons that referenced this pull request Dec 23, 2020