Fixing baichuan override. (#2158)
Narsil authored and glegendre01 committed Jul 2, 2024
1 parent b0e8592 commit b01c689
Showing 1 changed file with 5 additions and 0 deletions.
@@ -117,6 +117,11 @@ def __init__(
         self.hidden_size = config.hidden_size
         self.head_size = self.hidden_size // self.num_heads

+        # Setting defaults for baichuan custom config which doesn't apply them.
+        config.rope_theta = getattr(config, "rope_theta", 10000)
+        config.num_key_value_heads = getattr(
+            config, "num_key_value_heads", config.num_attention_heads
+        )
         self.rotary_emb = PositionRotaryEmbedding.static(
             config=config,
             dim=self.head_size,
Expand Down
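The added lines use Python's `getattr(obj, name, default)` to backfill attributes that the baichuan custom config class does not define, while leaving any explicitly configured values untouched. A minimal sketch of the same pattern, using a hypothetical `SimpleNamespace` stand-in for the real config object (the actual class in the repository is a Hugging Face-style config):

```python
from types import SimpleNamespace

# Hypothetical minimal config: like the baichuan custom config, it omits
# rope_theta and num_key_value_heads entirely.
config = SimpleNamespace(num_attention_heads=32)

# Same pattern as the commit: fall back to a default only when the
# attribute is missing; an existing value on the config wins.
config.rope_theta = getattr(config, "rope_theta", 10000)
config.num_key_value_heads = getattr(
    config, "num_key_value_heads", config.num_attention_heads
)

print(config.rope_theta)           # 10000
print(config.num_key_value_heads)  # 32
```

Because the defaults are written back onto `config` before `PositionRotaryEmbedding.static(config=config, ...)` is called, downstream code can read `config.rope_theta` and `config.num_key_value_heads` unconditionally.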