
Conversation

@angelayi (Contributor) commented Oct 15, 2025

Purpose

Fixes test failure in https://buildkite.com/vllm/ci/builds/35052/steps/canvas?sid=0199e88b-b61d-448b-a9cd-b96b4c779a60#0199e88b-b74c-49ca-8496-a34e8750a30a/217-11082

Test Plan

    def test_pp(self):
        config = CompilationConfig(
            mode=CompilationMode.VLLM_COMPILE,
            custom_ops=["+rms_norm"],
            pass_config=PassConfig(
                enable_async_tp=False,
                enable_sequence_parallelism=True,
                enable_noop=True,
            ),
            use_inductor_graph_partition=True,
        )

        llm = LLM(
            model="meta-llama/Llama-3.2-1B-Instruct",
            gpu_memory_utilization=0.6,
            max_model_len=2048, 
            max_num_seqs=8,
            compilation_config=config,
            tensor_parallel_size=2,  
            pipeline_parallel_size=2,  
            distributed_executor_backend="mp",
        )

        # Simple generation test
        prompts = ["Hello, my name is"]
        outputs = llm.generate(prompts, SamplingParams(temperature=0, max_tokens=32))

        # Print the outputs
        print("-" * 60)
        for output in outputs:
            prompt = output.prompt
            generated_text = output.outputs[0].text
            print(f"Prompt: {prompt!r}")
            print(f"Output: {generated_text!r}")
        print("-" * 60)

cc @ProExpertProg @cascade812

Signed-off-by: angelayi <yiangela7@gmail.com>
@mergify bot added the v1 label Oct 15, 2025
@angelayi marked this pull request as ready for review October 15, 2025 23:42
@ProExpertProg enabled auto-merge (squash) October 15, 2025 23:47
@github-actions bot added the ready (ONLY add when PR is ready to merge/full CI is needed) label Oct 15, 2025
@vllm-bot merged commit e19b16d into vllm-project:main Oct 16, 2025
51 of 53 checks passed
mandy-li pushed a commit to mandy-li/vllm that referenced this pull request Oct 16, 2025
albertoperdomo2 pushed a commit to albertoperdomo2/vllm that referenced this pull request Oct 16, 2025
lywa1998 pushed a commit to lywa1998/vllm that referenced this pull request Oct 20, 2025
alhridoy pushed a commit to alhridoy/vllm that referenced this pull request Oct 24, 2025
xuebwang-amd pushed a commit to xuebwang-amd/vllm that referenced this pull request Oct 24, 2025
0xrushi pushed a commit to 0xrushi/vllm that referenced this pull request Oct 26, 2025