
[Bugfix] Fix the MTP inference error when the prompt is long or contains Chinese text. #1468


Open · wants to merge 1 commit into base: v0.9.1-dev

Conversation

@Irving11-BKN commented Jun 26, 2025

What this PR does / why we need it?

1. Revert the code change to support PD separation. The bug is fixed, and a model with MTP can now output normally. A hedged reproduction sketch of the original failure mode follows below.
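For context, here is a minimal reproduction sketch of the failure mode named in the title: running MTP speculative decoding against a long prompt or a Chinese prompt. This is not taken from the PR itself; the model name, the `speculative_config` keys, and the `deepseek_mtp` method string are assumptions about the vLLM API and may need adjusting for the v0.9.1-dev branch of vllm-ascend.

```python
# Hypothetical reproduction sketch (not from this PR). Assumes a vLLM build
# with an MTP-capable DeepSeek-style model; the model path and the
# speculative_config contents below are illustrative assumptions.
from vllm import LLM, SamplingParams

llm = LLM(
    model="deepseek-ai/DeepSeek-V3",       # assumed MTP-capable checkpoint
    speculative_config={
        "method": "deepseek_mtp",          # assumed MTP speculative method name
        "num_speculative_tokens": 1,       # draft one extra token per step
    },
)

prompts = [
    # Long prompt: padded well past a short-prompt length.
    "Summarize the following document: " + "lorem ipsum " * 500,
    # Chinese prompt.
    "请用中文介绍一下多令牌预测（MTP）的基本原理。",
]

sampling = SamplingParams(max_tokens=64, temperature=0.0)
for out in llm.generate(prompts, sampling):
    # Before the fix, MTP output for these prompts was reported as broken;
    # after reverting the PD-separation change, generation works normally.
    print(out.outputs[0].text)
```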

Signed-off-by: curryliu <99582471+Irving11-BKN@users.noreply.github.com>
@wangxiyuan (Collaborator) commented:

The CI failure is due to a transformers version problem.
