The bug
Loading and prompting the transformer model openbmb/MiniCPM-Llama3-V-2_5 does not work.
It appears to load the model (but according to nvtop nothing is allocated on my GPU), and no error is thrown. Prompting the LLM returns immediately, without a response and without an error.
It is worth mentioning that openbmb provides a test script for transformers, and that script does work:
# test.py
import torch
from PIL import Image
from transformers import AutoModel, AutoTokenizer

model = AutoModel.from_pretrained('openbmb/MiniCPM-Llama3-V-2_5', trust_remote_code=True, torch_dtype=torch.float16)
model = model.to(device='cuda')
tokenizer = AutoTokenizer.from_pretrained('openbmb/MiniCPM-Llama3-V-2_5', trust_remote_code=True)
model.eval()

image = Image.open('xx.jpg').convert('RGB')
question = 'What is in the image?'
msgs = [{'role': 'user', 'content': question}]

res = model.chat(
    image=image,
    msgs=msgs,
    tokenizer=tokenizer,
    sampling=True,  # if sampling=False, beam_search will be used by default
    temperature=0.7,
    # system_prompt=''  # pass system_prompt if needed
)
print(res)
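Since nvtop shows no GPU allocation, one way to narrow the problem down is to check, independently of the model, whether CUDA is visible to torch at all on this machine. A minimal sketch (not part of the original report):

```python
import torch

# Minimal sanity check, independent of the model: is CUDA visible to torch?
# If this prints False, from_pretrained silently keeps all weights on the CPU.
print(torch.cuda.is_available())

# If CUDA is available, a small allocation should be reflected both in
# torch's own accounting and in nvtop / nvidia-smi.
if torch.cuda.is_available():
    x = torch.zeros(1024, 1024, device='cuda')
    print(torch.cuda.memory_allocated() > 0)
```

If `is_available()` is True here but the model still never shows up in nvtop, the problem is more likely in how the model is loaded or called than in the CUDA setup.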
The text was updated successfully, but these errors were encountered:
I actually do get an error on my machine during a forward pass: TypeError: MiniCPMV.forward() missing 1 required positional argument: 'data' (I can include the full traceback if helpful).
It seems that this model departs from the standard huggingface model-call API that we're using (likely because of multimodality).
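The mismatch can be illustrated with a toy sketch. These classes are hypothetical stand-ins, not the model's actual remote code; the `MiniCPMVLike` signature is only inferred from the traceback:

```python
# Toy illustration of the API mismatch (hypothetical classes, not the
# real remote code shipped with openbmb/MiniCPM-Llama3-V-2_5).

class StandardModel:
    # Typical HF forward: inputs are passed as keyword arguments.
    def forward(self, input_ids=None, attention_mask=None, **kwargs):
        return "ok"

class MiniCPMVLike:
    # Signature inferred from the traceback: a required positional `data`.
    def forward(self, data, **kwargs):
        return "ok"

std = StandardModel()
print(std.forward(input_ids=[1, 2, 3]))  # works with the keyword-style call

m = MiniCPMVLike()
try:
    # A generic harness that calls forward(input_ids=...) hits the error:
    m.forward(input_ids=[1, 2, 3])
except TypeError as e:
    print(e)  # forward() missing 1 required positional argument: 'data'
```

This is why the model's own `model.chat(...)` entry point works while a generic keyword-style `forward` call does not.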