Description
How do I set num_crops for LVLMs? For example, when initializing the processor for Phi-3.5-vision-instruct, the Hugging Face code looks like the following:
```python
from transformers import AutoProcessor

model_id = "microsoft/Phi-3.5-vision-instruct"
processor = AutoProcessor.from_pretrained(
    model_id,
    trust_remote_code=True,
    num_crops=4,  # number of image crops used by the processor
)
```
But I haven't found a way to set num_crops in vLLM.
I checked pull request #7710, but I didn't find a solution there.
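For reference, this is roughly what I'd hope to be able to write on the vLLM side. The `mm_processor_kwargs` argument below is only a guess at how such an option might be exposed; I haven't found it documented, so treat this as a sketch of the desired usage rather than working code:

```python
from vllm import LLM

# Hypothetical: forward processor kwargs to the underlying HF processor.
# mm_processor_kwargs is an assumed argument name, not something I found
# in the vLLM docs.
llm = LLM(
    model="microsoft/Phi-3.5-vision-instruct",
    trust_remote_code=True,
    mm_processor_kwargs={"num_crops": 4},
)
```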