Add LayerNorm support for Vivado #1110
base: main
Conversation
For the "broken" diffs, we should look to see what the whitespace error is and fix it before merging.
I did a bit of digging and it's not a whitespace problem; rather, the line endings are improperly encoded. Likely the person we got the code from was using a Windows machine. @rianbrooksflynn, you can install the
We should squash the commits when we merge this.
Looks good, could use some minor cosmetics.
'table_t', NamedType(name=layer.name + '_table_t', precision=FixedPrecisionType(width=16, integer=6))
)
if 'table_size' not in layer.attributes:
    layer.set_attr('table_size', 4096)  # table size
These attributes should be set as defaults in _register_layer_attributes(), not here.
Also, is 4096 necessary for this implementation to work? All other tables are 1024.
Unfortunately, according to my tests, 4096 is necessary to achieve absolute differences of < 0.05 in accuracy.
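As a sketch of what the reviewer is suggesting, defaults could be registered once per layer class instead of being set inside an optimizer pass. The snippet below is a minimal, self-contained illustration of that pattern; the names (`ConfigurableAttribute`, `attribute_map`, `_register_layer_attributes`) echo hls4ml's conventions but this is not the actual hls4ml code:

```python
# Illustrative sketch: register per-layer attribute defaults in one place
# (_register_layer_attributes) rather than scattering set_attr calls.
# Mimics hls4ml's attribute_map pattern; not the real hls4ml API.

class ConfigurableAttribute:
    """A user-configurable attribute with a name and a default value."""

    def __init__(self, name, default=None):
        self.name = name
        self.default = default


class Backend:
    def __init__(self):
        self.attribute_map = {}
        self._register_layer_attributes()

    def _register_layer_attributes(self):
        # The LayerNorm default lives here: table_size = 4096 (larger than
        # the usual 1024, needed for < 0.05 absolute error per this thread).
        self.attribute_map['LayerNormalization'] = [
            ConfigurableAttribute('table_size', default=4096),
        ]

    def default_attributes(self, layer_class):
        """Collect the registered defaults for a layer class."""
        return {a.name: a.default for a in self.attribute_map.get(layer_class, [])}


backend = Backend()
print(backend.default_attributes('LayerNormalization'))  # {'table_size': 4096}
```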
Thanks @vloncar for your review and apologies that it took me until now to get around to implementing your feedback! I would appreciate a second look from you at this pull request.
pre-commit.ci autofix
Fixed the latest merge conflicts. @vloncar, can you check if Rian's fixes are enough to have this merged?
Description
This PR adds support for Layer Normalization using either Keras or PyTorch with the Vivado backend in io_parallel mode. This implementation uses a lookup table for the inverse square root; the inputs to the lookup table follow a logarithmic distribution for better accuracy.
Tests have been added for both Keras and PyTorch parsing.
Credit is due to @Ethan0Jiang and @LostEcho365 (Zhixing Jiang and Dennis Yin) for their Vivado implementation and Keras parsing support; my contributions were changing the inverse square root lookup table implementation, implementing PyTorch support, and adding unit tests. (Here's a link to their pre-print.) The original code authors have given permission for their code to be merged into hls4ml.
Linked issue: #1109
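The logarithmically spaced inverse-square-root table described above can be sketched in NumPy as follows. The input range, table bounds, and the simple `searchsorted` lookup are illustrative assumptions, not the actual hls4ml implementation:

```python
import numpy as np


def build_invsqrt_table(table_size=4096, x_min=2.0**-10, x_max=2.0**10):
    """Build a 1/sqrt(x) lookup table with logarithmically spaced inputs.

    Log spacing concentrates entries where 1/sqrt(x) changes fastest
    (small x), which is the accuracy trick described in this PR.
    The range bounds here are illustrative placeholders.
    """
    inputs = np.logspace(np.log2(x_min), np.log2(x_max), table_size, base=2.0)
    return inputs, 1.0 / np.sqrt(inputs)


def invsqrt_lut(x, inputs, table):
    """Look up 1/sqrt(x) via the first table input >= x.

    Because the inputs are log-spaced, hardware could derive this index
    directly from the exponent bits of x instead of searching.
    """
    idx = np.clip(np.searchsorted(inputs, x), 0, len(table) - 1)
    return table[idx]


inputs, table = build_invsqrt_table()
print(invsqrt_lut(0.25, inputs, table))  # close to 1/sqrt(0.25) = 2.0
```

With 4096 log-spaced entries over this range, adjacent entries differ by a factor of about 2^(20/4095), so the lookup error stays small across many orders of magnitude of input, which is hard to achieve with a linearly spaced table of the same size.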
Type of change
Tests
Two unit tests added: test/pytest/test_layernorm.py and test/pytest/test_layernorm_pytorch.py
Checklist
I have run pre-commit on the files I edited or added.