Commit
Update doc
hauselin committed Jul 29, 2024
1 parent 7feca2a commit 6180026
Showing 3 changed files with 20 additions and 7 deletions.
8 changes: 7 additions & 1 deletion README.Rmd
```diff
@@ -23,7 +23,13 @@ The [Ollama R library](https://hauselin.github.io/ollama-r/) provides the easies
 
 > Note: You should have at least 8 GB of RAM available to run the 7B models, 16 GB to run the 13B models, and 32 GB to run the 33B models.
-See [Ollama's Github page](https://github.com/ollama/ollama) for more information. See also the [Ollama API documentation and endpoints](https://github.com/ollama/ollama/blob/main/docs/api.md). For Ollama Python, see [ollama-python](https://github.com/ollama/ollama-python). You'll need to have the [Ollama](https://ollama.com/) app installed on your computer to use this library.
+You'll need to have the [Ollama](https://ollama.com/) app installed on your computer to use this library.
+
+See [Ollama's Github page](https://github.com/ollama/ollama) for more information. See also the [Ollama API documentation and endpoints](https://github.com/ollama/ollama/blob/main/docs/api.md).
+
+## Ollama R versus Ollama Python
+
+This library has been inspired by the [Ollama Python library](https://github.com/ollama/ollama-python), so if you're coming from Python, you should feel right at home. Alternatively, if you plan to use Ollama with Python, using this R library will help you understand the Python library as well.
 
 ## Installation
```
17 changes: 12 additions & 5 deletions README.md
```diff
@@ -16,13 +16,20 @@ lets you run language models locally on your own machine. Main site:
 > Note: You should have at least 8 GB of RAM available to run the 7B
 > models, 16 GB to run the 13B models, and 32 GB to run the 33B models.
 
+You’ll need to have the [Ollama](https://ollama.com/) app installed on
+your computer to use this library.
+
 See [Ollama’s Github page](https://github.com/ollama/ollama) for more
 information. See also the [Ollama API documentation and
-endpoints](https://github.com/ollama/ollama/blob/main/docs/api.md). For
-Ollama Python, see
-[ollama-python](https://github.com/ollama/ollama-python). You’ll need to
-have the [Ollama](https://ollama.com/) app installed on your computer to
-use this library.
+endpoints](https://github.com/ollama/ollama/blob/main/docs/api.md).
+
+## Ollama R versus Ollama Python
+
+This library has been inspired by the [Ollama Python
+library](https://github.com/ollama/ollama-python), so if you’re coming
+from Python, you should feel right at home. Alternatively, if you plan
+to use Ollama with Python, using this R library will help you understand
+the Python library as well.
 
 ## Installation
```
2 changes: 1 addition & 1 deletion _pkgdown.yml
```diff
@@ -4,7 +4,7 @@ template:
   bootstrap: 5
 
 reference:
-- title: Official Ollama API calls
+- title: Official Ollama API endpoints
   desc: Functions to make calls to the Ollama server/API.
   contents:
   - generate
```
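For context, the `generate` function listed under the renamed "Official Ollama API endpoints" section wraps the server's `/api/generate` endpoint. A minimal sketch of how that might be called from R is below; it assumes the package is installed and a local Ollama server is running with a model already pulled (the model name `"llama3"` is an assumption, and the exact function signatures should be checked against the package documentation):

```r
# Sketch only: assumes the ollama-r package (ollamar) is installed and an
# Ollama server is running locally with the "llama3" model pulled.
library(ollamar)

# Check that the local Ollama server is reachable.
test_connection()

# Call the /api/generate endpoint via the package's generate() wrapper,
# then extract the text from the response.
resp <- generate("llama3", "Tell me a 5-word story")
resp_process(resp, "text")
```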
