Demonstrate usage of remote vocabs
danforbes committed Jun 1, 2023
1 parent 9574a5a commit 5363a09
Showing 1 changed file with 9 additions and 2 deletions.
README.md: 9 additions & 2 deletions
@@ -155,16 +155,23 @@ running it. Here's an example that uses the open-source
language model:

```shell
-llm gptneox infer -m RedPajama-INCITE-Base-3B-v1-q4_0.bin -p "Rust is a cool programming language because"
+llm gptneox infer -m RedPajama-INCITE-Base-3B-v1-q4_0.bin -p "Rust is a cool programming language because" -r togethercomputer/RedPajama-INCITE-Base-3B-v1
```

In the example above, the first two arguments specify the model architecture and
command, respectively. The required `-m` argument specifies the local path to
the model, and the required `-p` argument specifies the evaluation prompt. The
optional `-r` argument loads the model's vocabulary from a remote
Hugging Face 🤗 repository, which typically improves results compared to
loading the vocabulary from the model file itself; alternatively, the optional
`-v` argument specifies the path to a local vocabulary file.
For more information about the `llm` CLI, use the `--help` parameter.
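
To illustrate the `-v` option, a local-vocabulary invocation might look like the
following sketch; the `tokenizer.json` path is an assumed placeholder, not a file
that ships with the model:

```shell
# Sketch only: -v points at a locally downloaded vocabulary file (path is illustrative)
llm gptneox infer -m RedPajama-INCITE-Base-3B-v1-q4_0.bin -v ./tokenizer.json -p "Rust is a cool programming language because"
```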

There is also a [simple inference example](./crates/llm/examples/inference.rs)
that is helpful for [debugging](./.vscode/launch.json):

```shell
-cargo run --release --example inference gptneox RedPajama-INCITE-Base-3B-v1-q4_0.bin $OPTIONAL_PROMPT
+cargo run --release --example inference gptneox RedPajama-INCITE-Base-3B-v1-q4_0.bin -r $OPTIONAL_VOCAB_REPO -p $OPTIONAL_PROMPT
```
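
For example, filling in the placeholders with the model, remote vocabulary
repository, and prompt shown earlier might look like this:

```shell
# Run the inference example with a remote vocabulary repository and a sample prompt
cargo run --release --example inference gptneox RedPajama-INCITE-Base-3B-v1-q4_0.bin -r togethercomputer/RedPajama-INCITE-Base-3B-v1 -p "Rust is a cool programming language because"
```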

## Q&A
