convert-llama-ggml-to-gguf.py does not work after #3633 #4631
You can probably just remove the script at this point; it was originally only intended as a transitional tool, and I doubt many people are converting GGML files these days. I took a quick look at it, and it doesn't seem like there's a simple fix. The function that returned a …
@cebtenzzre

diff --git a/convert-llama-ggml-to-gguf.py b/convert-llama-ggml-to-gguf.py
index e359330..de14534 100755
--- a/convert-llama-ggml-to-gguf.py
+++ b/convert-llama-ggml-to-gguf.py
@@ -371,9 +371,10 @@ def handle_metadata(cfg, hp):
params = convert.Params.loadOriginalParamsJson(fakemodel, orig_config_path)
else:
raise ValueError('Unable to load metadata')
- vocab = convert.load_vocab(
+ vocab = convert.VocabLoader(
+ params,
cfg.vocab_dir if cfg.vocab_dir is not None else cfg.model_metadata_dir,
- cfg.vocabtype)
+ )
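If it helps to see the change in isolation, here is a rough sketch of what the patched vocab-loading step amounts to. The `VocabLoader(params, path)` call mirrors the diff above and is not verified against any particular `convert.py` revision; the helper name `load_vocab_for_ggml_conversion` is purely for illustration and is not part of the script.

```python
# Illustrative sketch only: the vocab-loading step of handle_metadata() after
# applying the diff above.
import convert  # llama.cpp's convert.py, assumed importable from the same directory


def load_vocab_for_ggml_conversion(cfg, params):
    # After #3633, convert.load_vocab() no longer exists; the vocab is built
    # through the VocabLoader class instead, pointed at whichever directory
    # holds the tokenizer files (explicit --vocab-dir, or the model metadata dir).
    vocab_path = cfg.vocab_dir if cfg.vocab_dir is not None else cfg.model_metadata_dir
    return convert.VocabLoader(params, vocab_path)
```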
cc @strutive07 @KerfuffleV2