Commit

fixes

leo-gan committed Aug 14, 2024
1 parent 650b1ea commit e696328
Showing 1 changed file with 3 additions and 3 deletions.
6 changes: 3 additions & 3 deletions docs/docs/integrations/llms/rwkv.mdx
@@ -1,12 +1,12 @@
-# RWKV-4
+# RWKV
 
 >[RWKV](https://www.rwkv.com/) (pronounced RwaKuv) language model is an RNN
 > with GPT-level LLM performance,
 > and it can also be directly trained like a GPT transformer (parallelizable).
 >
 >It's combining the best of RNN and transformer - great performance, fast inference,
 > fast training, saves VRAM, "infinite" ctxlen, and free text embedding.
-> Moreover it's 100% attention-free, and a LFAI project.
+> Moreover, it's 100% attention-free, and a LFAI project.
 
 ## Installation and Setup
@@ -20,7 +20,7 @@ pip install rwkv tokenizer
 - Download a [RWKV model](https://huggingface.co/BlinkDL/rwkv-4-raven/tree/main) and place it in your desired directory
 - Download a [tokens file](https://raw.githubusercontent.com/BlinkDL/ChatRWKV/main/20B_tokenizer.json)
 
-### Rwkv-4 models recommended VRAM
+### Rwkv models recommended VRAM
 
 | Model | 8bit | bf16/fp16 | fp32 |
 |-------|------|-----------|------|
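Once the weights and tokens file from the setup steps above are downloaded, the model can be loaded through LangChain's RWKV wrapper. The sketch below is illustrative only: the model filename, the `strategy` string, and the Raven-style prompt template are assumptions for this example, not part of the commit.

```python
# Illustrative sketch of using RWKV after the setup steps above.
# The prompt template, model filename, and strategy string are assumptions.

def generate_prompt(instruction, input=None):
    """Format a request in a Raven-style instruction template (assumed format)."""
    if input:
        return (
            "Below is an instruction that describes a task, paired with an input. "
            "Write a response that appropriately completes the request.\n\n"
            f"# Instruction:\n{instruction}\n\n# Input:\n{input}\n\n# Response:\n"
        )
    return (
        "Below is an instruction that describes a task. "
        "Write a response that appropriately completes the request.\n\n"
        f"# Instruction:\n{instruction}\n\n# Response:\n"
    )

# Loading the model requires the downloaded weights and tokens file,
# so it is shown here as comments only:
#
# from langchain_community.llms import RWKV  # check the import path for your LangChain version
# model = RWKV(
#     model="./models/RWKV-4-Raven-3B-v7.pth",  # path to downloaded weights (assumed name)
#     strategy="cpu fp32",                      # device/precision strategy string
#     tokens_path="./20B_tokenizer.json",       # the tokens file downloaded above
# )
# print(model.invoke(generate_prompt("Name three colors.")))
```

The template helper is plain string formatting and works without the model; the commented lines indicate where the real model call would go.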

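The VRAM figures in the table above roughly track parameter count times bytes per weight (1 byte at 8-bit, 2 at bf16/fp16, 4 at fp32), before any runtime overhead. A quick back-of-the-envelope sketch, using a hypothetical 7B-parameter model rather than any row of the table:

```python
def est_weight_vram_gib(n_params, bytes_per_param):
    """Rough VRAM needed just to hold the weights, in GiB (ignores activation overhead)."""
    return n_params * bytes_per_param / 2**30

# Hypothetical 7B-parameter model at each precision in the table:
for label, nbytes in [("8bit", 1), ("bf16/fp16", 2), ("fp32", 4)]:
    print(f"{label}: ~{est_weight_vram_gib(7e9, nbytes):.1f} GiB")
# → 8bit: ~6.5 GiB, bf16/fp16: ~13.0 GiB, fp32: ~26.1 GiB
```

Actual requirements are somewhat higher, since inference also needs memory for activations and the recurrent state.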