Does vllm support vicuna-13b-v1.5-16k ? #674

Closed
@Extremys

Description

The model seems to loop in its outputs
[Screenshot: model output repeating in a loop]

I'm using:
vllm 0.1.2
transformers 4.31.0
launched through the FastChat (commit b0462aa) vLLM worker wrapper
Any idea what's going on? Thanks.
