Commit 78b0513

Llama whitespace fix (vllm-project#36)
1 parent: f6fb119

File tree

1 file changed (+1, −0)

vllm/model_executor/models/llama.py

Lines changed: 1 addition & 0 deletions
@@ -236,6 +236,7 @@ def forward(
             kv_cache=kv_cache,
             attn_metadata=attn_metadata,
         )
+
         # Fully Connected
         hidden_states, residual = self.post_attention_layernorm(
             hidden_states, residual)
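
For context, the added blank line sits between the self-attention call and the "# Fully Connected" block of the decoder layer's forward pass. Below is a minimal, self-contained sketch of that pre-norm layer layout; only the lines quoted in the hunk above come from the actual file, while the class name, the stub attention and MLP modules, and the tensor shapes are illustrative assumptions, not vLLM's implementation.

# Illustrative sketch only -- not vLLM's actual LlamaDecoderLayer.
# It mirrors the structure around the diff hunk (attention call, then a
# blank line, then the "# Fully Connected" block); everything else is a
# simplified stand-in.
import torch
import torch.nn as nn


class SimpleDecoderLayer(nn.Module):
    def __init__(self, hidden_size: int = 64, num_heads: int = 4) -> None:
        super().__init__()
        self.input_layernorm = nn.LayerNorm(hidden_size)
        self.self_attn = nn.MultiheadAttention(hidden_size, num_heads,
                                               batch_first=True)
        self.post_attention_layernorm = nn.LayerNorm(hidden_size)
        self.mlp = nn.Sequential(
            nn.Linear(hidden_size, 4 * hidden_size),
            nn.SiLU(),
            nn.Linear(4 * hidden_size, hidden_size),
        )

    def forward(self, hidden_states: torch.Tensor) -> torch.Tensor:
        # Self Attention (pre-norm with residual connection)
        residual = hidden_states
        hidden_states = self.input_layernorm(hidden_states)
        hidden_states, _ = self.self_attn(hidden_states, hidden_states,
                                          hidden_states)
        hidden_states = residual + hidden_states

        # Fully Connected  (the commit adds a blank line just above this
        # comment in vllm/model_executor/models/llama.py)
        residual = hidden_states
        hidden_states = self.post_attention_layernorm(hidden_states)
        hidden_states = self.mlp(hidden_states)
        return residual + hidden_states


if __name__ == "__main__":
    layer = SimpleDecoderLayer()
    x = torch.randn(2, 8, 64)  # (batch, seq_len, hidden_size)
    print(layer(x).shape)      # torch.Size([2, 8, 64])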
