📑 Docs • 🌐 Website • 🤝 Contribute • ✍🏽 Blogs
✨ If you would like to help spread the word about Rig, please consider starring the repo!
Warning
Here be dragons! We plan to ship a torrent of features in the coming months, so future updates will contain breaking changes. As Rig evolves, we'll annotate changes and highlight migration paths as we encounter them.
Table of contents
- What is Rig?
- High-level features
- Who's using Rig in production?
- Get Started
- Integrations
Rig is a Rust library for building scalable, modular, and ergonomic LLM-powered applications.
More information about this crate can be found in the official documentation and the crate API reference.
- Agentic workflows that can handle multi-turn streaming and prompting
- Full GenAI Semantic Convention compatibility
- 20+ model providers, all under one unified interface
- 10+ vector store integrations, all under one unified interface
- Full support for LLM completion and embedding workflows
- Support for transcription, audio generation, and image generation model capabilities
- Integrate LLMs in your app with minimal boilerplate
- Full WASM compatibility (core library only)
Below is a non-exhaustive list of companies and people who are using Rig:
- St Jude - Using Rig for a chatbot utility as part of proteinpaint, a genomics visualisation tool.
- Coral Protocol - Using Rig extensively, both internally and as part of the Coral Rust SDK.
- VT Code - A Rust-based terminal coding agent with semantic code intelligence via Tree-sitter and ast-grep. VT Code uses rig to simplify LLM calls and to implement its model picker.
- Dria - A decentralised AI network, currently using Rig as part of their compute node.
- Nethermind - Using Rig as part of their Neural Interconnected Nodes Engine framework.
- Neon - Using Rig for their app.build V2 reboot in Rust.
- Listen - A framework aiming to become the go-to choice for AI portfolio management agents. Powers the Listen app.
- Cairnify - Helps users find documents, links, and information instantly through an intelligent search bar. Rig provides the agentic foundation behind Cairnify's AI search experience, enabling tool-calling, reasoning, and retrieval workflows.
- Ryzome - A visual AI workspace that lets you build interconnected canvases of thoughts, research, and AI agents to orchestrate complex knowledge work.
Are you also using Rig in production? Open an issue to have your name added!
```shell
cargo add rig-core
```

```rust
use rig::{client::CompletionClient, completion::Prompt, providers::openai};

#[tokio::main]
async fn main() {
    // Create the OpenAI client and an agent backed by GPT-4.
    // This requires the `OPENAI_API_KEY` environment variable to be set.
    let openai_client = openai::Client::from_env();
    let gpt4 = openai_client.agent("gpt-4").build();

    // Prompt the model and print its response
    let response = gpt4
        .prompt("Who are you?")
        .await
        .expect("Failed to prompt GPT-4");

    println!("GPT-4: {response}");
}
```

Note: using `#[tokio::main]` requires enabling tokio's `macros` and `rt-multi-thread` features, or just `full` to enable all features (`cargo add tokio --features macros,rt-multi-thread`).
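For reference, the `[dependencies]` section of a `Cargo.toml` matching the quickstart above might look like the sketch below. The version numbers are illustrative placeholders, not pinned recommendations; check crates.io for the latest releases.

```toml
[dependencies]
# Core Rig library (version is illustrative; check crates.io for the latest)
rig-core = "0.x"
# Async runtime; the `macros` and `rt-multi-thread` features are what
# `#[tokio::main]` needs, as noted above.
tokio = { version = "1", features = ["macros", "rt-multi-thread"] }
```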
You can find more examples in each crate's examples directory (e.g. rig-core/examples). More detailed use-case walkthroughs are regularly published on our Dev.to Blog and added to Rig's official documentation (docs.rig.rs).
Vector stores are available as separate companion crates:
- MongoDB: rig-mongodb
- LanceDB: rig-lancedb
- Neo4j: rig-neo4j
- Qdrant: rig-qdrant
- SQLite: rig-sqlite
- SurrealDB: rig-surrealdb
- Milvus: rig-milvus
- ScyllaDB: rig-scylladb
- AWS S3Vectors: rig-s3vectors
- HelixDB: rig-helixdb
The following providers are available as separate companion crates:
- AWS Bedrock: rig-bedrock
- Fastembed: rig-fastembed
- Eternal AI: rig-eternalai
- Google Vertex: rig-vertexai
We also have other associated crates with additional functionality you may find helpful when using Rig:
- rig-onchain-kit - the Rig Onchain Kit, intended to make interactions between Solana/EVM and Rig much easier to implement.