
GPTQ / Quantization support? #174

Closed

Description

@nikshepsvn

Will vLLM support 4-bit GPTQ models?
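For context, here is a minimal sketch of what loading a 4-bit GPTQ checkpoint might look like if vLLM added such support; the `quantization="gptq"` argument and the example model name are assumptions for illustration, not something confirmed in this issue.

```python
# Hypothetical sketch: serving a 4-bit GPTQ checkpoint with vLLM,
# assuming a `quantization="gptq"` option on the LLM constructor.
from vllm import LLM, SamplingParams

llm = LLM(
    model="TheBloke/Llama-2-7B-Chat-GPTQ",  # assumed example GPTQ checkpoint
    quantization="gptq",                    # assumed flag selecting GPTQ kernels
)

sampling_params = SamplingParams(temperature=0.8, max_tokens=64)
outputs = llm.generate(["What is GPTQ quantization?"], sampling_params)
print(outputs[0].outputs[0].text)
```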
