
group normalization layer test #766


Merged: 13 commits into tensorflow:master on Mar 4, 2020
Conversation

@autoih (Member) commented Dec 13, 2019

Address #731

@autoih changed the title from "[wip] group normalization layer test" to "group normalization layer test" on Feb 11, 2020
@autoih (Member, Author) commented Feb 18, 2020

Hi @Squadrick, can you please help review when you have time? Thank you.

@Squadrick (Member) left a comment

@autoih Sorry about the delay. Just a few small changes.

Also, use self.assert* functions instead of np.testing.assert*.
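For reference, the suggested swap would look something like this; a minimal sketch, assuming the tests subclass tf.test.TestCase (the test name and array values here are illustrative, not from the PR):

```python
import numpy as np
import tensorflow as tf


class GroupNormalizationTest(tf.test.TestCase):
    def test_outputs_close(self):
        # Illustrative stand-ins for the layer output and the reference value.
        expected = np.zeros((2, 4), dtype=np.float32)
        actual = np.zeros((2, 4), dtype=np.float32)

        # Instead of np.testing.assert_allclose(actual, expected, rtol=1e-6),
        # use the tf.test.TestCase helper:
        self.assertAllClose(actual, expected, rtol=1e-6)
```

The self.assert* helpers report failures through TF's test infrastructure and accept both tensors and NumPy arrays.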


@gabrieldemarmiesse (Member):

@googlebot I consent

@googlebot:

CLAs look good, thanks!

@gabrieldemarmiesse (Member) commented Feb 26, 2020

@autoih I've merged master into your branch to bring it up to date and fixed the formatting conflicts it had. If you need to make further modifications, please run git pull first.

@autoih (Member, Author) commented Feb 27, 2020

Thanks @gabrieldemarmiesse for the reminder. Also, can you please guide me on fixing the error?

@gabrieldemarmiesse (Member):

It seems it was just an internet connection issue. I'll restart the tests and hopefully they will pass.

@autoih (Member, Author) commented Feb 27, 2020

Thanks @gabrieldemarmiesse.

@Squadrick (Member) left a comment

Hard-code the random seed since you're using np.random*. I'm going to trigger the GPU tests.
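A minimal sketch of what that looks like, assuming the test inputs are drawn with np.random (the seed value and shape are illustrative):

```python
import numpy as np

# Hard-code the seed so the randomly drawn test inputs are
# reproducible across runs, including the GPU test jobs.
np.random.seed(42)
inputs = np.random.rand(10, 10, 10).astype(np.float32)
```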

Comment on lines 258 to 259
out -= tf.keras.backend.eval(norm.beta)
out /= tf.keras.backend.eval(norm.gamma)
@Squadrick (Member):

You've used both self.evaluate and tf.keras.backend.eval. Stick to the former throughout the PR.
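A sketch of the consistent version, assuming the snippet lives inside a tf.test.TestCase method; the import path and surrounding test body here are illustrative:

```python
import numpy as np
import tensorflow as tf
from tensorflow_addons.layers import GroupNormalization


class GroupNormalizationTest(tf.test.TestCase):
    def test_undo_affine_transform(self):
        norm = GroupNormalization(groups=2)
        norm.build(input_shape=(None, 4))  # creates norm.beta and norm.gamma

        out = np.random.rand(3, 4).astype(np.float32)

        # self.evaluate used consistently, rather than mixing in
        # tf.keras.backend.eval:
        out -= self.evaluate(norm.beta)
        out /= self.evaluate(norm.gamma)
```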

@autoih (Member, Author):

Thanks @Squadrick. Do you also suggest setting a seed for tf.random, like here?

@Squadrick (Member):

Yup, that too. Forgot to mention it.
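The tf.random counterpart; a minimal sketch, assuming TF 2.x's global seeding API (the seed value and shape are illustrative):

```python
import tensorflow as tf

# Fix the global seed so tf.random draws are reproducible as well.
tf.random.set_seed(42)
x = tf.random.normal(shape=(2, 8, 8, 4))
```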

@gabrieldemarmiesse (Member) left a comment

Thank you for the pull request!

@gabrieldemarmiesse dismissed Squadrick's stale review March 4, 2020 10:49

Review has been addressed.

@gabrieldemarmiesse merged commit 2f00a76 into tensorflow:master Mar 4, 2020
@autoih deleted the gnorm branch March 27, 2020 23:51
jrruijli pushed a commit to jrruijli/addons that referenced this pull request Dec 23, 2020
* init testing

* use np assert_allclose

* add no center and scale test

* sanitycheck

* add test

* revise based on comments

* add random seed

Co-authored-by: Gabriel de Marmiesse <gabrieldemarmiesse@gmail.com>