
Conversation


@rdyro rdyro commented Dec 28, 2024

As pointed out by antra on the LLM Discord, `rope_theta` is never passed to the `apply_rotary_emb` function. This PR fixes that by passing the parameter through.
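For context, here is a minimal NumPy sketch of what passing `rope_theta` through means in practice. The function name matches the one above, but the signature and shapes are assumptions for illustration, not the repo's actual code; the point is that if the model config's `rope_theta` (e.g. 500000 for Llama3 instead of the default 10000) never reaches the rotation, the positional angles are silently wrong.

```python
import numpy as np

def apply_rotary_emb(x, positions, rope_theta=10000.0):
    # Hypothetical sketch: rotate adjacent feature pairs by
    # position-dependent angles. `rope_theta` is the base of the
    # frequency ladder; if the caller forgets to pass the config value,
    # the hardcoded default (10000) is used and every angle is wrong.
    d = x.shape[-1]
    inv_freq = 1.0 / (rope_theta ** (np.arange(0, d, 2) / d))
    angles = positions[:, None] * inv_freq[None, :]  # (seq, d/2)
    cos, sin = np.cos(angles), np.sin(angles)
    x1, x2 = x[..., 0::2], x[..., 1::2]
    out = np.empty_like(x)
    out[..., 0::2] = x1 * cos - x2 * sin
    out[..., 1::2] = x1 * sin + x2 * cos
    return out
```

Because RoPE only rotates feature pairs, per-position vector norms are preserved regardless of `rope_theta`; only the angles (and hence attention scores between positions) change.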

Additionally, Llama3 requires a frequency correction: https://github.com/huggingface/transformers/blob/5c75087aeee7081025370e10d1f571a11600f1ae/src/transformers/modeling_rope_utils.py#L310
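A sketch of that correction, following the logic in the linked `modeling_rope_utils.py` (the parameter names and defaults here are taken from Llama3-style configs but should be treated as assumptions): high-frequency components are left untouched, low-frequency components are scaled down by `factor`, and the band in between is smoothly interpolated.

```python
import numpy as np

def rope_inv_freq(head_dim, rope_theta=10000.0):
    # Standard RoPE inverse frequencies: theta^(-2i/d) for i = 0..d/2-1.
    return 1.0 / (rope_theta ** (np.arange(0, head_dim, 2) / head_dim))

def llama3_correct_inv_freq(inv_freq, factor=8.0, low_freq_factor=1.0,
                            high_freq_factor=4.0, original_max_pos=8192):
    # Llama3 frequency correction (after the linked HF implementation).
    low_freq_wavelen = original_max_pos / low_freq_factor
    high_freq_wavelen = original_max_pos / high_freq_factor
    wavelen = 2 * np.pi / inv_freq
    # Interpolation weight in [0, 1] for the middle band.
    smooth = (original_max_pos / wavelen - low_freq_factor) / (
        high_freq_factor - low_freq_factor)
    return np.where(
        wavelen > low_freq_wavelen,
        inv_freq / factor,                      # low-frequency band: scaled
        np.where(wavelen < high_freq_wavelen,
                 inv_freq,                      # high-frequency band: kept
                 (1 - smooth) * inv_freq / factor + smooth * inv_freq))
```

Without this step, passing `rope_theta` alone is not enough for Llama3 checkpoints: the long-wavelength frequencies still need to be rescaled for the extended context.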
