Commit
fix doc (vllm-project#622)
SiriusNEO authored Jul 31, 2023
1 parent 953f28c commit aa39e42
Showing 2 changed files with 2 additions and 2 deletions.
2 changes: 1 addition & 1 deletion vllm/config.py
@@ -224,7 +224,7 @@ class SchedulerConfig:
             a single iteration.
         max_num_seqs: Maximum number of sequences to be processed in a single
             iteration.
-        max_seq_len: Maximum length of a sequence (including prompt
+        max_model_len: Maximum length of a sequence (including prompt
             and generated text).
     """
2 changes: 1 addition & 1 deletion vllm/engine/llm_engine.py
@@ -353,7 +353,7 @@ def _stop_sequences(self, seq_groups: List[SequenceGroup]) -> None:
             if stopped:
                 continue
 
-            # Check if the sequence has reached max_seq_len.
+            # Check if the sequence has reached max_model_len.
             if seq.get_len() > self.scheduler_config.max_model_len:
                 self.scheduler.free_seq(
                     seq, SequenceStatus.FINISHED_LENGTH_CAPPED)
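The length-cap check touched by this hunk can be illustrated with a standalone sketch. The `Sequence`, `SequenceStatus`, and `cap_if_too_long` definitions below are simplified stand-ins for illustration, not the actual vLLM classes; only the comparison `get_len() > max_model_len` mirrors the diffed code.

```python
# Minimal sketch of the length-cap logic shown in the hunk above.
# These classes are simplified stand-ins, not vLLM's real ones.
from enum import Enum, auto


class SequenceStatus(Enum):
    RUNNING = auto()
    FINISHED_LENGTH_CAPPED = auto()


class Sequence:
    def __init__(self, prompt_len: int, output_len: int) -> None:
        self.prompt_len = prompt_len
        self.output_len = output_len
        self.status = SequenceStatus.RUNNING

    def get_len(self) -> int:
        # Total length counts the prompt plus all generated tokens,
        # matching the "(including prompt and generated text)" docstring.
        return self.prompt_len + self.output_len


def cap_if_too_long(seq: Sequence, max_model_len: int) -> bool:
    # Mirror of the check in LLMEngine._stop_sequences: once a sequence
    # exceeds max_model_len, it is finished as length-capped.
    if seq.get_len() > max_model_len:
        seq.status = SequenceStatus.FINISHED_LENGTH_CAPPED
        return True
    return False


seq = Sequence(prompt_len=2000, output_len=100)
cap_if_too_long(seq, max_model_len=2048)  # 2100 > 2048, so the sequence is capped
```

Note that the comparison is strict (`>`), so a sequence whose total length exactly equals `max_model_len` is still allowed.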
