From 6180026e8e8c73015ab94212989a69522a66c8b7 Mon Sep 17 00:00:00 2001
From: Hause Lin
Date: Sun, 28 Jul 2024 23:29:09 -0400
Subject: [PATCH] Update doc

---
 README.Rmd   |  8 +++++++-
 README.md    | 17 ++++++++++++-----
 _pkgdown.yml |  2 +-
 3 files changed, 20 insertions(+), 7 deletions(-)

diff --git a/README.Rmd b/README.Rmd
index 2ecf319..e2569fb 100644
--- a/README.Rmd
+++ b/README.Rmd
@@ -23,7 +23,13 @@ The [Ollama R library](https://hauselin.github.io/ollama-r/) provides the easies
 
 > Note: You should have at least 8 GB of RAM available to run the 7B models, 16 GB to run the 13B models, and 32 GB to run the 33B models.
 
-See [Ollama's Github page](https://github.com/ollama/ollama) for more information. See also the [Ollama API documentation and endpoints](https://github.com/ollama/ollama/blob/main/docs/api.md). For Ollama Python, see [ollama-python](https://github.com/ollama/ollama-python). You'll need to have the [Ollama](https://ollama.com/) app installed on your computer to use this library.
+You'll need to have the [Ollama](https://ollama.com/) app installed on your computer to use this library.
+
+See [Ollama's Github page](https://github.com/ollama/ollama) for more information. See also the [Ollama API documentation and endpoints](https://github.com/ollama/ollama/blob/main/docs/api.md).
+
+## Ollama R versus Ollama Python
+
+This library has been inspired by the [Ollama Python library](https://github.com/ollama/ollama-python), so if you're coming from Python, you should feel right at home. Alternatively, if you plan to use Ollama with Python, using this R library will help you understand the Python library as well.
 
 ## Installation
 
diff --git a/README.md b/README.md
index 2f7f1ac..11768e4 100644
--- a/README.md
+++ b/README.md
@@ -16,13 +16,20 @@ lets you run language models locally on your own machine. Main site:
 > Note: You should have at least 8 GB of RAM available to run the 7B
 > models, 16 GB to run the 13B models, and 32 GB to run the 33B models.
 
+You’ll need to have the [Ollama](https://ollama.com/) app installed on
+your computer to use this library.
+
 See [Ollama’s Github page](https://github.com/ollama/ollama) for more
 information. See also the [Ollama API documentation and
-endpoints](https://github.com/ollama/ollama/blob/main/docs/api.md). For
-Ollama Python, see
-[ollama-python](https://github.com/ollama/ollama-python). You’ll need to
-have the [Ollama](https://ollama.com/) app installed on your computer to
-use this library.
+endpoints](https://github.com/ollama/ollama/blob/main/docs/api.md).
+
+## Ollama R versus Ollama Python
+
+This library has been inspired by the [Ollama Python
+library](https://github.com/ollama/ollama-python), so if you’re coming
+from Python, you should feel right at home. Alternatively, if you plan
+to use Ollama with Python, using this R library will help you understand
+the Python library as well.
 
 ## Installation
 
diff --git a/_pkgdown.yml b/_pkgdown.yml
index d98396b..8a96e56 100644
--- a/_pkgdown.yml
+++ b/_pkgdown.yml
@@ -4,7 +4,7 @@ template:
   bootstrap: 5
 
 reference:
-  - title: Official Ollama API calls
+  - title: Official Ollama API endpoints
     desc: Functions to make calls to the Ollama server/API.
     contents:
       - generate