
fix broken padding setting in HuggingMazeTokenizer #195

Merged: 4 commits, merged Oct 4, 2023

Commits on Oct 3, 2023

  1. new transformer_lens broke padding!!!

    Need a better actual fix for this, probably via some kind of
    setting in the saved model.

    mivanit committed Oct 3, 2023 · 0785a09

Commits on Oct 4, 2023

  1. fixed tokenizer padding side being overwritten

    As of transformer_lens 1.6.1 (PR TransformerLensOrg/TransformerLens#344),
    the padding side is handled differently, and our left-padding was being
    overridden. This fixes it by re-asserting the padding side in
    `ZanjHookedTransformer.__init__` and in the `HuggingMazeTokenizer`
    (see the sketch after the commit list).

    mivanit committed Oct 4, 2023 · 7880202
  2. 8d4b26c
  3. format

    mivanit committed Oct 4, 2023 · cf8329a
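
The commit messages above only describe the fix at a high level. As a rough illustration of the pattern being described (re-asserting the tokenizer's left padding side after a parent class's `__init__` may have reset it), here is a minimal, self-contained sketch. The `Fake*` classes are hypothetical stand-ins for `HuggingMazeTokenizer`, `HookedTransformer`, and `ZanjHookedTransformer`; they are not the actual maze-transformer or transformer_lens code.

```python
# Hypothetical stand-ins illustrating the fix pattern described in the commits.
# Assumption: the parent __init__ (newer transformer_lens) resets
# tokenizer.padding_side, so the subclass re-asserts "left" afterwards.


class FakeTokenizer:
    """Stand-in for HuggingMazeTokenizer; maze-transformer wants left padding."""

    def __init__(self) -> None:
        self.padding_side: str = "left"


class FakeHookedTransformer:
    """Stand-in for HookedTransformer after the 1.6.1 behavior change."""

    def __init__(self, tokenizer: FakeTokenizer) -> None:
        self.tokenizer = tokenizer
        # the unwanted override that the PR works around
        self.tokenizer.padding_side = "right"


class FakeZanjHookedTransformer(FakeHookedTransformer):
    """Stand-in for ZanjHookedTransformer: restore left padding after init."""

    def __init__(self, tokenizer: FakeTokenizer) -> None:
        super().__init__(tokenizer)
        # the fix: re-assert the padding side the maze tokenizer relies on
        self.tokenizer.padding_side = "left"


model = FakeZanjHookedTransformer(FakeTokenizer())
assert model.tokenizer.padding_side == "left"
```

In the real codebase the same re-assertion would sit after the transformer_lens setup; the Oct 3 commit suggests a longer-term alternative of persisting the padding preference as a setting in the saved model.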