Unable to convert Smaug 72B #5807

Closed
@schmorp

Description

I am unable to convert https://huggingface.co/abacusai/Smaug-72B-v0.1 (and others) to GGUF with either convert.py or convert-hf-to-gguf.py.

With the former, I get:

RuntimeError: Internal: ./src/sentencepiece_processor.cc(1101) [model_proto->ParseFromArray(serialized.data(), serialized.size())]

"internal" feels like a bug. When I add --vocab-type hfft (and then --pad-vocab because it tells me to), I get a nonfunctional model:

llm_load_vocab: SPM vocabulary, but newline token not found: unordered_map::at! Using special_pad_id instead.
llm_load_vocab: mismatch in special tokens definition ( 421/152064 vs 214/152064 ).

convert-hf-to-gguf.py does not work either: it does not support "LlamaForCausalLM".
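For context, the ParseFromArray failure usually means the file handed to sentencepiece is not a SentencePiece protobuf at all; this model appears to ship only an HF fast tokenizer (tokenizer.json) rather than a tokenizer.model, which is presumably why the tool then suggests --vocab-type hfft. Below is a minimal, hypothetical stdlib-only sketch (guess_vocab_type is not part of llama.cpp) of the file check that distinguishes the two cases:

```python
import os
import tempfile

def guess_vocab_type(model_dir: str) -> str:
    """Hypothetical helper: pick a --vocab-type based on which
    tokenizer files the model checkout actually contains."""
    if os.path.exists(os.path.join(model_dir, "tokenizer.model")):
        return "spm"   # SentencePiece protobuf present
    if os.path.exists(os.path.join(model_dir, "tokenizer.json")):
        return "hfft"  # only an HF fast tokenizer is available
    raise FileNotFoundError(f"no known tokenizer file in {model_dir}")

# Demo on a throwaway directory that mimics this model's layout:
# tokenizer.json is present, tokenizer.model is not.
with tempfile.TemporaryDirectory() as d:
    open(os.path.join(d, "tokenizer.json"), "w").close()
    print(guess_vocab_type(d))  # hfft
```

This is only an illustration of the failure mode, not a fix: with hfft the conversion runs, but the vocabulary mismatch above remains.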
