diff --git a/README.Rmd b/README.Rmd
index 2580e40..7457839 100644
--- a/README.Rmd
+++ b/README.Rmd
@@ -21,7 +21,7 @@ knitr::opts_chunk$set(
 [![CRAN_Download_Badge](https://cranlogs.r-pkg.org/badges/grand-total/ollamar)](https://cran.r-project.org/package=ollamar)
 
-The [Ollama R library](https://hauselin.github.io/ollama-r/) is the easiest way to integrate R with [Ollama](https://ollama.com/), which lets you run language models locally on your own machine. Main site: https://hauselin.github.io/ollama-r/
+The [Ollama R library](https://hauselin.github.io/ollama-r/) is the easiest way to integrate R with [Ollama](https://ollama.com/), which lets you run language models locally on your own machine.
 
 The library also makes it easy to work with data structures (e.g., conversational/chat histories) that are standard for different LLMs (such as those provided by OpenAI and Anthropic). It also lets you specify different output formats (e.g., dataframes, text/vector, lists) that best suit your need, allowing easy integration with other libraries/tools and parallelization via the `httr2` library.
 
@@ -63,7 +63,7 @@ remotes::install_github("hauselin/ollamar")
 
 ## Example usage
 
-Below is a basic demonstration of how to use the library. For details, see the [getting started vignette](https://hauselin.github.io/ollama-r/articles/ollamar.html).
+Below is a basic demonstration of how to use the library. For details, see the [getting started vignette](https://hauselin.github.io/ollama-r/articles/ollamar.html) on our [main page](https://hauselin.github.io/ollama-r/).
 
 `ollamar` uses the [`httr2` library](https://httr2.r-lib.org/index.html) to make HTTP requests to the Ollama server, so many functions in this library return an `httr2_response` object by default. If the response object says `Status: 200 OK`, then the request was successful.
diff --git a/README.md b/README.md
index 3f34990..7d1493a 100644
--- a/README.md
+++ b/README.md
@@ -13,8 +13,7 @@ status](https://www.r-pkg.org/badges/version/ollamar)](https://CRAN.R-project.or
 
 The [Ollama R library](https://hauselin.github.io/ollama-r/) is the
 easiest way to integrate R with [Ollama](https://ollama.com/), which
-lets you run language models locally on your own machine. Main site:
-
+lets you run language models locally on your own machine.
 
 The library also makes it easy to work with data structures (e.g.,
 conversational/chat histories) that are standard for different LLMs
@@ -85,7 +84,8 @@ remotes::install_github("hauselin/ollamar")
 
 Below is a basic demonstration of how to use the library. For details,
 see the [getting started
-vignette](https://hauselin.github.io/ollama-r/articles/ollamar.html).
+vignette](https://hauselin.github.io/ollama-r/articles/ollamar.html) on
+our [main page](https://hauselin.github.io/ollama-r/).
 
 `ollamar` uses the [`httr2` library](https://httr2.r-lib.org/index.html)
 to make HTTP requests to the Ollama server, so many functions in this
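The README text touched by these diffs describes the workflow the package documents: functions return `httr2_response` objects by default, a `Status: 200 OK` response means the request succeeded, and the output format can be chosen to suit your needs. A minimal sketch of that workflow, assuming a local Ollama server is running and a model (here `"llama3.1"`, an illustrative choice) has already been pulled:

```r
library(ollamar)

# Check that the local Ollama server is reachable; returns an
# httr2_response object -- Status: 200 OK means the request succeeded.
test_connection()

# generate() also returns an httr2_response object by default.
resp <- generate("llama3.1", "Tell me a 5-word story")

# resp_process() converts the response into the output format that
# best suits your need, e.g. plain text or a dataframe.
resp_process(resp, "text")
resp_process(resp, "df")
```

The model name and prompt above are placeholders; any model available to your local Ollama install works the same way.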