sfeng33 (Contributor) commented Sep 10, 2025

Purpose

This PR completes the migration to MultiModalFeatureSpec by removing legacy fields (mm_positions, mm_kwargs, mm_hashes) that were temporarily kept for backward compatibility. This is a follow-up to #23779 which introduced the unified MultiModalFeatureSpec data structure.
Fixes: #23872
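To illustrate the shape of the consolidation, here is a minimal sketch of how three parallel per-item lists (`mm_positions`, `mm_kwargs`, `mm_hashes`) collapse into one list of per-item specs. The class and field definitions below are illustrative only, modeled on the structure introduced in #23779; they are not the actual vLLM definitions.

```python
from dataclasses import dataclass
from typing import Any, Optional


@dataclass
class PlaceholderRange:
    # Illustrative stand-in for the placeholder-range type.
    offset: int  # index of the first placeholder token in the prompt
    length: int  # number of placeholder tokens


@dataclass
class MultiModalFeatureSpec:
    # One object per multimodal item, replacing one entry from each legacy list.
    data: Optional[dict[str, Any]]  # was one entry of mm_kwargs
    modality: str                   # e.g. "image"
    identifier: str                 # was one entry of mm_hashes
    mm_position: PlaceholderRange   # was one entry of mm_positions


# Legacy representation: three parallel lists that had to stay index-aligned.
mm_positions = [PlaceholderRange(offset=5, length=576)]
mm_hashes = ["abc123"]
mm_kwargs = [{"pixel_values": "<tensor>"}]

# Unified representation: one spec per item, so alignment is structural.
mm_features = [
    MultiModalFeatureSpec(data=k, modality="image", identifier=h, mm_position=p)
    for p, h, k in zip(mm_positions, mm_hashes, mm_kwargs)
]
```

Consumers then read all per-item attributes off a single object instead of indexing three lists with the same position.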

Test Plan

python -m vllm.entrypoints.openai.api_server \
    --model llava-hf/llava-1.5-7b-hf 

curl -X POST "http://localhost:8000/v1/chat/completions" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "llava-hf/llava-1.5-7b-hf",
    "messages": [
      {
        "role": "user",
        "content": [
          {"type": "text", "text": "What do you see in this image?"},
          {
            "type": "image_url",
            "image_url": {
              "url": "https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/transformers/tasks/car.jpg"
            }
          }
        ]
      }
    ],
    "max_tokens": 100,
    "temperature": 0
  }'
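The same request can be built in Python, which may be easier to tweak when re-running the test plan. This is just the curl payload above expressed with the standard library; it assumes the server from the Test Plan is running at localhost:8000 (the sending code is left commented out so the snippet is self-contained).

```python
import json

# Identical to the curl request above: one text part plus one image_url part.
payload = {
    "model": "llava-hf/llava-1.5-7b-hf",
    "messages": [
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "What do you see in this image?"},
                {
                    "type": "image_url",
                    "image_url": {
                        "url": "https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/transformers/tasks/car.jpg"
                    },
                },
            ],
        }
    ],
    "max_tokens": 100,
    "temperature": 0,
}
body = json.dumps(payload).encode()

# To actually send it (requires the api_server from the Test Plan):
# import urllib.request
# req = urllib.request.Request(
#     "http://localhost:8000/v1/chat/completions",
#     data=body,
#     headers={"Content-Type": "application/json"},
# )
# print(urllib.request.urlopen(req).read().decode())
```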

@mergify mergify bot added v1 tpu Related to Google TPUs labels Sep 10, 2025
@sfeng33 sfeng33 marked this pull request as ready for review September 10, 2025 01:22
@sfeng33 sfeng33 changed the title [Multimodal] Migrate to MultiModalFeatureSpec [Multimodal] Remove legacy multimodal fields in favor of MultiModalFeatureSpec Sep 10, 2025
@DarkLight1337 (Member) left a comment

Thanks for cleaning this up!

@DarkLight1337 DarkLight1337 enabled auto-merge (squash) September 10, 2025 18:02
@DarkLight1337 DarkLight1337 added the ready ONLY add when PR is ready to merge/full CI is needed label Sep 10, 2025
@DarkLight1337 (Member) commented

PTAL at the failing test

mergify bot commented Sep 10, 2025

This pull request has merge conflicts that must be resolved before it can be
merged. Please rebase the PR, @sfeng33.

https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/working-with-forks/syncing-a-fork

@mergify mergify bot added the needs-rebase label Sep 10, 2025
Signed-off-by: sfeng33 <4florafeng@gmail.com>
auto-merge was automatically disabled September 12, 2025 00:06

Head branch was pushed to by a user without write access

@mergify mergify bot removed the needs-rebase label Sep 12, 2025
sfeng33 commented Sep 12, 2025

Hey @zhewenl, this PR triggers the BC-linter error due to the migration of NewRequestData's fields. Could you advise on what to do here? I couldn't find the info in the wiki.

zhewenl commented Sep 12, 2025

You can follow the wiki here: https://github.com/pytorch/test-infra/wiki/BC-Linter#suppression

Either asking the committers to add the label to your PR or adding the suppression to the commit message should work.

sfeng33 commented Sep 12, 2025


Thanks for your reply! I guess my question is: how do I know whether a linter error can be safely suppressed? Since you just merged the linter change covering NewRequestData, are there steps to take before making changes to that class?

zhewenl commented Sep 12, 2025


I see. In this case I think you can just suppress it: we are still piloting this feature and planning to roll it out for public interfaces. If the changes are approved by committers, it's okay to suppress.

sfeng33 commented Sep 12, 2025


Got it, thanks!

Yikun pushed a commit to vllm-project/vllm-ascend that referenced this pull request Sep 20, 2025
### What this PR does / why we need it?
1. This PR bumps the vLLM commit to vllm-project/vllm@6d8246a
2. Adapts to the upstream change vllm-project/vllm#24548 (removal of the multi-modal kwargs fields), so both vLLM main and `v0.10.2` remain supported
3. Fixes metadata_builder changes introduced by vllm-project/vllm#23693
4. Fixes `structured_outputs_config` changes introduced by vllm-project/vllm#22772
5. Fixes `moe_config` changes introduced by vllm-project/vllm#22537

Co-authored-by:  MengqingCao <cmq0113@163.com>
Co-authored-by:  Yikun Jiang <yikunkero@gmail.com>


- vLLM version: v0.10.2
- vLLM main:
vllm-project/vllm@c60e613

---------

Signed-off-by: wangli <wangli858794774@gmail.com>
Signed-off-by: MengqingCao <cmq0113@163.com>
Co-authored-by: MengqingCao <cmq0113@163.com>
Labels

ready (ONLY add when PR is ready to merge/full CI is needed), tpu (Related to Google TPUs), v1

Development

Successfully merging this pull request may close these issues.

[Renderer]: Consolidate MM classes to MultiModalFeatureSpec