sync error message in test_generate_with_prompt_ids_max_length
benniekiss committed Sep 12, 2024
1 parent 8f11346 commit 4da7924
Showing 1 changed file with 4 additions and 4 deletions.
8 changes: 4 additions & 4 deletions tests/models/whisper/test_modeling_whisper.py
```diff
@@ -1349,12 +1349,12 @@ def test_generate_with_prompt_ids_max_length(self):

         with self.assertRaisesRegex(
             ValueError,
-            f"The length of `decoder_input_ids` equal `prompt_ids` plus special start tokens is {decoder_input_ids.shape[-1]}, and the `max_new_tokens` "
-            f"is {max_new_tokens}. Thus, the combined length of "
+            f"The length of `decoder_input_ids`, including special start tokens, prompt tokens, and previous tokens, is {decoder_input_ids.shape[-1]}, "
+            f" and `max_new_tokens` is {max_new_tokens}. Thus, the combined length of "
             f"`decoder_input_ids` and `max_new_tokens` is: {max_new_tokens + decoder_input_ids.shape[-1]}. This exceeds the "
-            f"`max_target_positions` of the Whisper model: {config.max_target_positions}. "
+            f"`max_target_positions` of the Whisper model: {self.config.max_target_positions}. "
             "You should either reduce the length of your prompt, or reduce the value of `max_new_tokens`, "
-            f"so that their combined length is less than {config.max_target_positions}.",
+            f"so that their combined length is less than {self.config.max_target_positions}.",
         ):
             model.generate(input_features, max_new_tokens=max_new_tokens, prompt_ids=prompt_ids)
```
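The reason the expected string must be kept in sync is that `assertRaisesRegex` performs a regex search against `str(exception)`: if the message raised by the model drifts from the pattern in the test, the test fails even though the right exception type was raised. Below is a minimal, self-contained sketch of that mechanism; `fake_generate`, `MAX_TARGET_POSITIONS`, and the message text are illustrative stand-ins, not the actual Whisper implementation.

```python
import unittest

# Illustrative limit, not a value taken from the Whisper config.
MAX_TARGET_POSITIONS = 448


def fake_generate(decoder_input_len, max_new_tokens):
    # Stand-in for model.generate(): raise when the combined length of the
    # decoder prompt and the new tokens would exceed the position limit.
    combined = decoder_input_len + max_new_tokens
    if combined > MAX_TARGET_POSITIONS:
        raise ValueError(
            f"The combined length of `decoder_input_ids` and `max_new_tokens` "
            f"is: {combined}. This exceeds the `max_target_positions` of "
            f"{MAX_TARGET_POSITIONS}."
        )
    return combined


class SyncedMessageTest(unittest.TestCase):
    def test_overflow_message(self):
        # assertRaisesRegex does re.search() on str(exception); backticks
        # are ordinary characters in a regex, so they match literally.
        with self.assertRaisesRegex(ValueError, r"exceeds the `max_target_positions`"):
            fake_generate(400, 100)
```

If the wording inside `fake_generate` changed without a matching update to the pattern, `assertRaisesRegex` would report the mismatch, which is exactly the kind of drift this commit corrects.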
