Rerender readme
hauselin committed Sep 10, 2024
1 parent 4fca9c0 commit 4bf2620
Showing 2 changed files with 5 additions and 5 deletions.
4 changes: 2 additions & 2 deletions README.Rmd
@@ -21,7 +21,7 @@ knitr::opts_chunk$set(
[![CRAN_Download_Badge](https://cranlogs.r-pkg.org/badges/grand-total/ollamar)](https://cran.r-project.org/package=ollamar)
<!-- badges: end -->

- The [Ollama R library](https://hauselin.github.io/ollama-r/) is the easiest way to integrate R with [Ollama](https://ollama.com/), which lets you run language models locally on your own machine. Main site: https://hauselin.github.io/ollama-r/
+ The [Ollama R library](https://hauselin.github.io/ollama-r/) is the easiest way to integrate R with [Ollama](https://ollama.com/), which lets you run language models locally on your own machine.

The library also makes it easy to work with data structures (e.g., conversational/chat histories) that are standard for different LLMs (such as those provided by OpenAI and Anthropic). It also lets you specify different output formats (e.g., dataframes, text/vector, lists) that best suit your need, allowing easy integration with other libraries/tools and parallelization via the `httr2` library.
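
For illustration, a minimal sketch of what that looks like in practice, assuming the `chat()` function and output formats described in the getting started vignette and a locally pulled `llama3.1` model (any available model works):

``` r
library(ollamar)

# A chat history in the list-of-messages format used by OpenAI/Anthropic-style APIs
messages <- list(
  list(role = "system", content = "You are a helpful assistant."),
  list(role = "user", content = "Why is the sky blue?")
)

# Ask for the reply as a data frame; "text", "jsonlist", or the raw
# httr2 response are other output options
chat("llama3.1", messages, output = "df")
```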

@@ -63,7 +63,7 @@ remotes::install_github("hauselin/ollamar")

## Example usage

- Below is a basic demonstration of how to use the library. For details, see the [getting started vignette](https://hauselin.github.io/ollama-r/articles/ollamar.html).
+ Below is a basic demonstration of how to use the library. For details, see the [getting started vignette](https://hauselin.github.io/ollama-r/articles/ollamar.html) on our [main page](https://hauselin.github.io/ollama-r/).

`ollamar` uses the [`httr2` library](https://httr2.r-lib.org/index.html) to make HTTP requests to the Ollama server, so many functions in this library return an `httr2_response` object by default. If the response object says `Status: 200 OK`, then the request was successful.
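
As a quick illustration (a sketch, assuming `test_connection()`, `generate()`, and `resp_process()` as documented in the vignette, and a locally pulled `llama3.1` model):

``` r
library(ollamar)

# Returns an httr2 response; Status: 200 OK means the local Ollama server is reachable
test_connection()

# generate() also returns an httr2_response by default
resp <- generate("llama3.1", "Tell me a 5-word story")
resp_process(resp, "text")  # extract the reply as a character vector
```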

6 changes: 3 additions & 3 deletions README.md
@@ -13,8 +13,7 @@ status](https://www.r-pkg.org/badges/version/ollamar)](https://CRAN.R-project.or

The [Ollama R library](https://hauselin.github.io/ollama-r/) is the
easiest way to integrate R with [Ollama](https://ollama.com/), which
- lets you run language models locally on your own machine. Main site:
- <https://hauselin.github.io/ollama-r/>
+ lets you run language models locally on your own machine.

The library also makes it easy to work with data structures (e.g.,
conversational/chat histories) that are standard for different LLMs
@@ -85,7 +84,8 @@ remotes::install_github("hauselin/ollamar")

Below is a basic demonstration of how to use the library. For details,
see the [getting started
- vignette](https://hauselin.github.io/ollama-r/articles/ollamar.html).
+ vignette](https://hauselin.github.io/ollama-r/articles/ollamar.html) on
+ our [main page](https://hauselin.github.io/ollama-r/).

`ollamar` uses the [`httr2` library](https://httr2.r-lib.org/index.html)
to make HTTP requests to the Ollama server, so many functions in this
