2 changes: 1 addition & 1 deletion Cargo.lock

Some generated files are not rendered by default.

2 changes: 1 addition & 1 deletion Cargo.toml
@@ -1,6 +1,6 @@
[package]
name = "edgee"
version = "2.0.0"
version = "2.0.1"
edition = "2021"
authors = ["Edgee <opensource@edgee.cloud>"]
license = "Apache-2.0"
304 changes: 41 additions & 263 deletions README.md
@@ -1,21 +1,11 @@
# Edgee Rust SDK

A modern, type-safe, idiomatic Rust SDK for the [Edgee AI Gateway](https://www.edgee.cloud).

[![Crates.io](https://img.shields.io/crates/v/edgee.svg)](https://crates.io/crates/edgee)
[![License](https://img.shields.io/badge/license-Apache%202.0-blue.svg)](LICENSE)
[![Rust](https://img.shields.io/badge/rust-1.75%2B-orange.svg)](https://www.rust-lang.org)

## Features

- **🦀 Idiomatic Rust** - Leverages Rust's type system, ownership, and error handling
- **⚡ Async/Await** - Built on tokio for efficient async operations
- **🔒 Type-Safe** - Strong typing with enums, structs, and comprehensive error types
- **📡 Streaming** - First-class support for streaming responses with `Stream` trait
- **🛠️ Tool Calling** - Full support for function/tool calling
- **🎯 Flexible Input** - Accept strings, message arrays, or structured objects
- **🚀 Zero-Cost Abstractions** - Efficient implementation with minimal overhead
- **📦 Minimal Dependencies** - Only essential, well-maintained dependencies

## Installation

Add this to your `Cargo.toml`:
@@ -33,300 +23,88 @@ use edgee::Edgee

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Create client from environment variables (EDGEE_API_KEY)
    let client = Edgee::from_env()?;

    // Simple text completion
    let response = client.send("gpt-4o", "What is the capital of France?").await?;
    println!("{}", response.text().unwrap_or(""));
    // "The capital of France is Paris."

    Ok(())
}
```

## Configuration

The SDK supports multiple ways to configure the client:

### From Environment Variables

Set `EDGEE_API_KEY` (required) and optionally `EDGEE_BASE_URL`:

```rust
let client = Edgee::from_env()?;
```

### With API Key

```rust
let client = Edgee::with_api_key("your-api-key");
```

### With Custom Configuration

```rust
use edgee::EdgeeConfig;

let config = EdgeeConfig::new("your-api-key")
.with_base_url("https://custom.api.url");

let client = Edgee::new(config);
```

## Usage Examples

### Simple Text Completion

The `send()` method makes a non-streaming chat completion request:

```rust
use edgee::Edgee;

let client = Edgee::from_env()?;
let response = client.send("gpt-4o", "Explain Rust in one sentence").await?;

// Access the response
println!("{}", response.text().unwrap_or(""));    // Text content
println!("{:?}", response.finish_reason());       // Finish reason
if let Some(tool_calls) = response.tool_calls() { // Tool calls (if any)
    println!("{:?}", tool_calls);
}
```

### Multi-turn Conversation

```rust
use edgee::{Edgee, Message};

let client = Edgee::from_env()?;

let messages = vec![
    Message::system("You are a helpful assistant."),
    Message::user("What's the capital of France?"),
];

let response = client.send("gpt-4o", messages).await?;
println!("{}", response.text().unwrap_or(""));
```

### Streaming Responses

The `stream()` method enables real-time streaming responses:

```rust
use edgee::Edgee;
use tokio_stream::StreamExt;

let client = Edgee::from_env()?;
let mut stream = client.stream("gpt-4o", "Tell me a story").await?;

while let Some(result) = stream.next().await {
    match result {
        Ok(chunk) => {
            if let Some(text) = chunk.text() {
                print!("{}", text);
            }

            if let Some(reason) = chunk.finish_reason() {
                println!("\nFinished: {}", reason);
            }
        }
        Err(e) => eprintln!("Error: {}", e),
    }
}
```

### Tool/Function Calling

```rust
use edgee::{Edgee, Message, InputObject, Tool, FunctionDefinition, JsonSchema};
use std::collections::HashMap;

let client = Edgee::from_env()?;

// Define a function
let function = FunctionDefinition {
    name: "get_weather".to_string(),
    description: Some("Get the weather for a location".to_string()),
    parameters: JsonSchema {
        schema_type: "object".to_string(),
        properties: Some({
            let mut props = HashMap::new();
            props.insert("location".to_string(), serde_json::json!({
                "type": "string",
                "description": "The city and state"
            }));
            props
        }),
        required: Some(vec!["location".to_string()]),
        description: None,
    },
};

// Send request with tools
let input = InputObject::new(vec![
    Message::user("What's the weather in Tokyo?")
])
.with_tools(vec![Tool::function(function)]);

let response = client.send("gpt-4o", input).await?;

// Handle tool calls
if let Some(tool_calls) = response.tool_calls() {
    for call in tool_calls {
        println!("Function: {}", call.function.name);
        println!("Arguments: {}", call.function.arguments);
    }
}
```
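
The tool-calling loop usually has one more step: execute the function yourself and send its result back so the model can produce a plain-text answer. The sketch below is illustrative rather than part of the documented API: it assumes each returned tool call exposes an `id` field to pass as the `tool_call_id` expected by `Message::tool`, and the weather JSON is a placeholder.

```rust
// Hypothetical follow-up: run get_weather yourself, then report the result back.
if let Some(tool_calls) = response.tool_calls() {
    let call = &tool_calls[0];
    // Placeholder result; a real implementation would parse call.function.arguments
    // and query a weather service.
    let weather_json = r#"{"temp_c": 21, "condition": "sunny"}"#;

    let follow_up = vec![
        Message::user("What's the weather in Tokyo?"),
        // Assumption: the tool call exposes an `id` field usable as tool_call_id.
        Message::tool(call.id.clone(), weather_json),
    ];

    let final_response = client.send("gpt-4o", follow_up).await?;
    println!("{}", final_response.text().unwrap_or(""));
}
```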

## API Reference

### Client

#### `Edgee::new(config: EdgeeConfig) -> Self`

Create a new client with the given configuration.

#### `Edgee::from_env() -> Result<Self>`

Create a client from environment variables (`EDGEE_API_KEY`, `EDGEE_BASE_URL`).

#### `Edgee::with_api_key(api_key: impl Into<String>) -> Self`

Create a client with just an API key (uses default base URL).

#### `Edgee::send(model: impl Into<String>, input: impl Into<Input>) -> Result<SendResponse>`

Send a non-streaming chat completion request.

- **model**: Model identifier (e.g., "gpt-4o", "mistral-large-latest")
- **input**: Can be a `&str`, `String`, `Vec<Message>`, or `InputObject`

#### `Edgee::stream(model: impl Into<String>, input: impl Into<Input>) -> Result<impl Stream<Item = Result<StreamChunk>>>`

Send a streaming chat completion request.

Returns a `Stream` of `StreamChunk` items that can be processed as they arrive.
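
Beyond printing each delta as it arrives (see the streaming example above), a common pattern is to accumulate the deltas into the full reply. A minimal sketch, assuming only the `stream()` signature and `StreamChunk::text()` accessor documented here, and a `client` built as in the Configuration section:

```rust
use tokio_stream::StreamExt;

let mut stream = client.stream("gpt-4o", "Summarize Rust in two sentences").await?;
let mut parts: Vec<String> = Vec::new();

while let Some(result) = stream.next().await {
    let chunk = result?;
    if let Some(delta) = chunk.text() {
        // to_string() keeps the sketch agnostic to whether text() yields &str or String.
        parts.push(delta.to_string());
    }
}

println!("{}", parts.concat());
```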

### Data Models

#### `Message`

Represents a message in the conversation.

**Constructors:**
- `Message::system(content)` - System message
- `Message::user(content)` - User message
- `Message::assistant(content)` - Assistant message
- `Message::tool(tool_call_id, content)` - Tool response message
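
A small sketch combining the constructors above into one conversation; the `"call_123"` id is a placeholder for a real tool call id:

```rust
use edgee::Message;

let messages = vec![
    Message::system("You are a helpful assistant."),
    Message::user("Use the calculator tool to add 2 and 2."),
    Message::assistant("Calling the calculator tool now."),
    Message::tool("call_123", "4"), // (tool_call_id, content)
];
```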

#### `InputObject`

Structured input for chat completions.

```rust
let input = InputObject::new(messages)
    .with_tools(tools)
    .with_tool_choice(choice);
```

#### `SendResponse`

Response from a non-streaming request.

**Convenience methods:**
- `text()` - Get text from the first choice
- `message()` - Get the message from the first choice
- `finish_reason()` - Get the finish reason
- `tool_calls()` - Get tool calls from the first choice

#### `StreamChunk`

A chunk from a streaming response.

**Convenience methods:**
- `text()` - Get the text delta from the first choice
- `role()` - Get the role from the first choice
- `finish_reason()` - Get the finish reason

### Error Handling

The SDK uses a custom `Error` enum with `thiserror`:

```rust
use edgee::{Edgee, Error};

match client.send("gpt-4o", "Hello").await {
    Ok(response) => println!("{}", response.text().unwrap_or("")),
    Err(Error::Api { status, message }) => {
        eprintln!("API error {}: {}", status, message);
    }
    Err(Error::MissingApiKey) => {
        eprintln!("API key not found");
    }
    Err(e) => {
        eprintln!("Error: {}", e);
    }
}
```

## Supported Models

The SDK works with any model supported by the Edgee AI Gateway, including:

- OpenAI: `gpt-4o`, `gpt-4-turbo`, `gpt-3.5-turbo`
- Anthropic: `claude-3-5-sonnet-20241022`, `claude-3-opus-20240229`
- Mistral: `mistral-large-latest`, `mistral-medium-latest`
- And more...
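
Because the gateway works with any of these models, switching providers is just a matter of changing the model string; for example, reusing the `send()` call shown earlier:

```rust
// Identical call shape across providers; only the model id changes.
let openai = client.send("gpt-4o", "Say hi").await?;
let anthropic = client.send("claude-3-5-sonnet-20241022", "Say hi").await?;
let mistral = client.send("mistral-large-latest", "Say hi").await?;
```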

## Documentation

For complete documentation, examples, and API reference, visit:

**👉 [Official Rust SDK Documentation](https://www.edgee.cloud/docs/sdk/rust)**

The documentation includes:
- [Configuration guide](https://www.edgee.cloud/docs/sdk/rust/configuration) - Multiple ways to configure the SDK
- [Send method](https://www.edgee.cloud/docs/sdk/rust/send) - Complete guide to non-streaming requests
- [Stream method](https://www.edgee.cloud/docs/sdk/rust/stream) - Streaming responses guide
- [Tools](https://www.edgee.cloud/docs/sdk/rust/tools) - Function calling guide

## Examples

Run the examples to see the SDK in action:

```bash
# Set your API key
export EDGEE_API_KEY="your-api-key"

# Simple example
cargo run --example simple

# Streaming example
cargo run --example streaming

# Tool calling example
cargo run --example tools
```

## Comparison with Python SDK

This Rust SDK provides similar functionality to the Python SDK with Rust-specific improvements:

| Feature | Python SDK | Rust SDK |
|---------|-----------|----------|
| API | Synchronous | Async/await |
| Type Safety | Runtime (dataclasses) | Compile-time (strong types) |
| Error Handling | Exceptions | `Result<T, E>` |
| Streaming | Generator | `Stream` trait |
| Input Flexibility | ✅ | ✅ (with `Into` traits) |
| Tool Calling | ✅ | ✅ |
| Dependencies | Zero (stdlib only) | Minimal (tokio, reqwest, serde) |
| Performance | Good | Excellent (zero-cost abstractions) |

## Rust Idioms Used

This SDK follows Rust best practices:

- **Strong Typing**: Uses enums for roles, structs for messages
- **Builder Pattern**: `EdgeeConfig::new().with_base_url()`
- **Into Traits**: Flexible input with `impl Into<Input>`
- **Error Handling**: `Result<T, E>` with `thiserror`
- **Async/Await**: Non-blocking I/O with tokio
- **Stream Trait**: Idiomatic streaming with `futures::Stream`
- **Option Types**: `Option<T>` for optional fields
- **Zero-Copy**: Efficient string handling with references
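
A short sketch tying several of these idioms together, reusing only calls shown earlier in this README (builder-style configuration and the `Into<Input>` conversions):

```rust
use edgee::{Edgee, EdgeeConfig, InputObject, Message};

// Builder pattern for configuration.
let client = Edgee::new(
    EdgeeConfig::new("your-api-key").with_base_url("https://custom.api.url"),
);

// The same send() accepts a &str, a Vec<Message>, or an InputObject via Into<Input>.
let _ = client.send("gpt-4o", "plain string prompt").await?;
let _ = client.send("gpt-4o", vec![Message::user("message list")]).await?;
let _ = client.send("gpt-4o", InputObject::new(vec![Message::user("structured input")])).await?;
```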

## Testing

Run the test suite:

```bash
cargo test
```

## Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

## License

Licensed under the Apache License, Version 2.0. See [LICENSE](LICENSE) for details.

## Links

- [Edgee AI Gateway](https://www.edgee.cloud)
- [Documentation](https://docs.rs/edgee)
- [GitHub Repositories](https://github.com/edgee-cloud)