
0xPlaygrounds/rig

✨ If you would like to help spread the word about Rig, please consider starring the repo!

Warning

Here be dragons! Rig is alpha software and will contain breaking changes as it evolves. We'll annotate them and highlight migration paths as we encounter them.

What is Rig?

Rig is a Rust library for building scalable, modular, and ergonomic LLM-powered applications.

More information about this crate can be found in the crate documentation.

Help us improve Rig by contributing to our Feedback form.

High-level features

  • Full support for LLM completion and embedding workflows
  • Simple but powerful common abstractions over LLM providers (e.g. OpenAI, Cohere) and vector stores (e.g. MongoDB, in-memory)
  • Integrate LLMs in your app with minimal boilerplate
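
The "common abstraction over providers" idea can be pictured with a small, self-contained trait sketch. Note that the names below (`CompletionModel`, `ask`, the mock provider types) are purely illustrative and are not Rig's actual API; they only show the design principle of coding against one trait while swapping providers freely:

```rust
// Illustrative sketch: application code depends on one trait,
// so concrete providers are interchangeable. Not Rig's real API.
trait CompletionModel {
    fn complete(&self, prompt: &str) -> String;
}

struct MockOpenAi;
struct MockCohere;

impl CompletionModel for MockOpenAi {
    fn complete(&self, prompt: &str) -> String {
        format!("openai-reply to '{prompt}'")
    }
}

impl CompletionModel for MockCohere {
    fn complete(&self, prompt: &str) -> String {
        format!("cohere-reply to '{prompt}'")
    }
}

// Application code only sees the trait object, never the concrete provider.
fn ask(model: &dyn CompletionModel, prompt: &str) -> String {
    model.complete(prompt)
}

fn main() {
    println!("{}", ask(&MockOpenAi, "Who are you?"));
    println!("{}", ask(&MockCohere, "Who are you?"));
}
```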

Get Started

```sh
cargo add rig-core
```

Simple example:

```rust
use rig::{completion::Prompt, providers::openai};

#[tokio::main]
async fn main() {
    // Create OpenAI client and model.
    // This requires the `OPENAI_API_KEY` environment variable to be set.
    let openai_client = openai::Client::from_env();

    let gpt4 = openai_client.agent("gpt-4").build();

    // Prompt the model and print its response
    let response = gpt4
        .prompt("Who are you?")
        .await
        .expect("Failed to prompt GPT-4");

    println!("GPT-4: {response}");
}
```

Note: using `#[tokio::main]` requires enabling tokio's `macros` and `rt-multi-thread` features, or simply `full` to enable all features (`cargo add tokio --features macros,rt-multi-thread`).
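
For reference, the resulting `[dependencies]` section of `Cargo.toml` would look roughly like this (the version numbers below are illustrative; check crates.io for the current releases):

```toml
[dependencies]
# Versions are illustrative only.
rig-core = "0.1"
tokio = { version = "1", features = ["macros", "rt-multi-thread"] }
```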

You can find more examples in each crate's examples directory (i.e. `src/examples`). More detailed use-case walkthroughs are regularly published on our Dev.to blog.

Supported Integrations

Model Providers: OpenAI (ChatGPT), Anthropic (Claude), Cohere, Gemini, xAI, Perplexity

Vector Stores: MongoDB, Neo4j, LanceDB

Vector stores are available as separate companion crates.
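
At its core, a vector store ranks stored documents by the similarity of their embeddings to a query embedding. The following self-contained sketch illustrates that idea with cosine similarity; the function names and data layout are invented for illustration and are not Rig's vector-store API:

```rust
// Illustrative sketch of a vector store's core operation:
// rank documents by cosine similarity to a query embedding.
fn cosine_similarity(a: &[f64], b: &[f64]) -> f64 {
    let dot: f64 = a.iter().zip(b).map(|(x, y)| x * y).sum();
    let norm_a = a.iter().map(|x| x * x).sum::<f64>().sqrt();
    let norm_b = b.iter().map(|x| x * x).sum::<f64>().sqrt();
    dot / (norm_a * norm_b)
}

// Return the name of the stored document closest to the query.
fn top_match<'a>(query: &[f64], docs: &'a [(&'a str, Vec<f64>)]) -> &'a str {
    docs.iter()
        .max_by(|(_, a), (_, b)| {
            cosine_similarity(query, a)
                .partial_cmp(&cosine_similarity(query, b))
                .unwrap()
        })
        .map(|(name, _)| *name)
        .unwrap()
}

fn main() {
    let docs = vec![
        ("rust-doc", vec![1.0, 0.0]),
        ("llm-doc", vec![0.0, 1.0]),
    ];
    // The query embedding points mostly along the first axis.
    println!("{}", top_match(&[0.9, 0.1], &docs)); // prints "rust-doc"
}
```

Real vector-store backends (MongoDB, Neo4j, LanceDB) additionally handle persistence and approximate-nearest-neighbor indexing, but the ranking principle is the same.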



Built by Playgrounds