
Commit

Fix missing link (#386)
First of all, kudos for this project.
It's the only project I found that properly supports modern models like llama-3.1 out of the box, and the speed and other factors also seem better.

This PR fixes a small bug: the previous link led to a missing page.
mrT23 authored Sep 15, 2024
1 parent 06b5c4e commit 365955a
Showing 1 changed file with 1 addition and 1 deletion.
README.md (1 addition, 1 deletion):
@@ -179,7 +179,7 @@ model = GPTQModel.from_quantized(quant_output_dir)
print(tokenizer.decode(model.generate(**tokenizer("gptqmodel is", return_tensors="pt").to(model.device))[0]))
```

- For more advanced features of model quantization, please reference to [this script](examples/quantization/quant_with_alpaca.py)
+ For more advanced features of model quantization, please reference to [this script](https://github.com/ModelCloud/GPTQModel/blob/main/examples/quantization/basic_usage_wikitext2.py)

### How to Add Support for a New Model

