
fix broken padding setting in HuggingMazeTokenizer #195

Merged
merged 4 commits into from
Oct 4, 2023

Conversation

mivanit
Member

@mivanit mivanit commented Oct 3, 2023

As of transformer_lens 1.6.1 (via PR TransformerLensOrg/TransformerLens#344), padding is configured differently, and our left-padding setting was being overridden. This fixes it by adjusting `ZanjHookedTransformer.__init__` and the `HuggingMazeTokenizer`.
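The idea of the workaround can be sketched as follows. This is a hedged illustration, not the actual PR diff: `DummyTokenizer` and `ensure_left_padding` are hypothetical names standing in for `HuggingMazeTokenizer` and the re-assertion logic in `ZanjHookedTransformer.__init__`.

```python
class DummyTokenizer:
    """Stand-in for HuggingMazeTokenizer (hypothetical, for illustration only)."""

    def __init__(self):
        # transformer_lens >= 1.6.1 may reset this to "right" during model setup,
        # clobbering the left-padding the maze tokenizer relies on.
        self.padding_side = "right"


def ensure_left_padding(tokenizer):
    """Re-assert left padding after upstream init may have overridden it."""
    if getattr(tokenizer, "padding_side", None) != "left":
        tokenizer.padding_side = "left"
    return tokenizer


tok = ensure_left_padding(DummyTokenizer())
```

After the call, `tok.padding_side` is `"left"` again, regardless of what the upstream initialization set it to.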

We still need a proper fix for this, probably via some kind of
setting in the saved model.

going to bed for now
@review-notebook-app
Check out this pull request on ReviewNB to see visual diffs and provide feedback on Jupyter Notebooks.

@mivanit mivanit changed the title new transformer lens broke padding!!! fix broken padding setting in HuggingMazeTokenizer due to new transformer_lens version Oct 4, 2023
@mivanit mivanit changed the title fix broken padding setting in HuggingMazeTokenizer due to new transformer_lens version fix broken padding setting in HuggingMazeTokenizer Oct 4, 2023
@mivanit mivanit mentioned this pull request Oct 4, 2023
3 tasks
@mivanit mivanit merged commit a08bfd7 into main Oct 4, 2023
4 checks passed
@mivanit mivanit deleted the fix-issue-of-some-kind branch October 4, 2023 05:18