Commit
add cover
hahnyuan committed Apr 4, 2023
1 parent 5d7a014 commit bd0f3ba
Showing 2 changed files with 2 additions and 0 deletions.
README.md (2 changes: 2 additions & 0 deletions)
@@ -3,6 +3,8 @@ Large-scale language models (LLMs) have shown exceptional performance on various
In our [paper](https://arxiv.org/abs/2304.01089), we propose a novel reorder-based quantization approach called RPTQ. RPTQ rearranges the channels in the activations and then quantizes them in clusters, reducing the impact of the range differences between channels.
By applying RPTQ, we achieve a significant breakthrough, pushing LLMs to 3-bit activation quantization for the first time.

![Overview](/ims/cover.png)

### Requirements
python packages
- torch >= 2.0.0
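The reorder-based quantization idea described in the README excerpt above can be sketched in a few lines. This is a toy illustration under my own assumptions (the function name, the range-sorting reorder, and the equal-size contiguous clustering are hypothetical simplifications, not the paper's actual implementation): channels are reordered by dynamic range and quantized in clusters, each cluster sharing one scale, so small-range channels are not crushed by the scale of large-range ones.

```python
def rptq_sketch(activations, num_clusters=2, bits=3):
    """Toy sketch of reorder-based cluster quantization (not the paper's code).

    activations: list of channels, each channel a list of floats.
    Returns the dequantized activations for inspecting quantization error.
    """
    n = len(activations)
    # Per-channel dynamic range (max absolute value).
    ranges = [max(abs(v) for v in ch) for ch in activations]
    # Reorder step: sort channel indices so similar ranges become adjacent.
    order = sorted(range(n), key=lambda i: ranges[i])
    qmax = 2 ** (bits - 1) - 1  # signed 3-bit -> qmax = 3
    size = (n + num_clusters - 1) // num_clusters
    dequantized = [None] * n
    # Quantize each contiguous cluster of reordered channels with one shared scale.
    for start in range(0, n, size):
        cluster = order[start:start + size]
        scale = max(ranges[i] for i in cluster) / qmax or 1.0
        for i in cluster:
            q = [max(-qmax - 1, min(qmax, round(v / scale))) for v in activations[i]]
            dequantized[i] = [x * scale for x in q]
    return dequantized
```

With two channels of very different ranges, e.g. `[[0.1, -0.2], [10.0, -8.0]]`, each channel lands in its own cluster and gets a scale matched to its own range, whereas a single shared 3-bit scale would round the small channel to zero.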
Binary file added ims/cover.png
