fixed some grammar errors and clarified some definitions in Article.svelte #70

Open
damoebe wants to merge 2 commits into poloclub:main from damoebe:main
Conversation

@damoebe
Contributor

@damoebe damoebe commented Feb 28, 2026

I added a hint that Transformer Explainer focuses only on decoder-only Transformers, which are the type used for GPTs, to avoid possible confusion with other transformer types such as encoder-decoder or encoder-only Transformers. I also changed the definition of "Scaling · Mask": the mask applied in Transformer Explainer is a causal mask, which differs from a padding mask.
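To illustrate the distinction the comment draws, here is a minimal sketch of the two mask types. The helper names are hypothetical and not taken from the Transformer Explainer codebase; this only shows the concept:

```typescript
// Sketch of the two mask types discussed above (hypothetical helpers,
// not the actual Transformer Explainer code).

// A causal mask lets position i attend only to positions j <= i, so a
// decoder-only (GPT-style) model cannot attend to future tokens.
function causalMask(seqLen: number): number[][] {
  return Array.from({ length: seqLen }, (_, i) =>
    Array.from({ length: seqLen }, (_, j) => (j <= i ? 0 : -Infinity))
  );
}

// A padding mask instead hides positions that hold padding tokens,
// regardless of their order (`isPad[j] === true` marks padding).
function paddingMask(isPad: boolean[]): number[][] {
  return Array.from({ length: isPad.length }, () =>
    isPad.map((pad) => (pad ? -Infinity : 0))
  );
}
```

Either mask would be added to the scaled attention scores before the softmax, so `-Infinity` entries receive zero attention weight.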

Feel free to adjust my changes :)

Clarify the explanation of the scaling and masking process in attention scores, including the use of padding masks.