TSBX orchestrates long-lived, Docker-backed sandboxes for agent workflows. It bundles a Rust service stack, a Node.js CLI, and an Operator UI so teams can provision, monitor, and control persistent workspaces connected to an OpenAI-compatible inference endpoint.
- Linux host with Docker 20.10+
- Node.js 18+ and npm (for the CLI)
- Rust 1.82+ (only if you plan to build the Rust services locally)
- Host + inference providers/models defined in `~/.tsbx/tsbx.json` (copy `config/tsbx.sample.json`)
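If you want to sanity-check the prerequisites first, the standard version flags work (these are generic Docker/Node/Rust commands, not part of TSBX):

```bash
docker --version   # expect 20.10 or newer
node --version     # expect v18 or newer
rustc --version    # only needed if you build the Rust services locally; expect 1.82+
```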
- **Install or link the CLI**

  ```bash
  npm install -g ./cli       # from this repo
  # or
  npm install -g @tsbx/cli

  # for local changes
  ./scripts/link.sh
  ```
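  To confirm the binary is on your `PATH` afterwards, a plain shell check is enough (`command -v` is generic shell, not a TSBX command):

  ```bash
  command -v tsbx   # prints the resolved path if the install or link succeeded
  ```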
- **Configure host + inference providers**
  - Copy `config/tsbx.sample.json` to `~/.tsbx/tsbx.json` (or supply `--config <path>` when running `tsbx start`); see the sketch below this list.
  - Fill in the `host` block plus the `inference` section's `providers` array. Each provider just needs a `url` and `models`. The first provider in the list becomes the default, and each provider's first model is used unless you override it per sandbox.
  - Individual sandboxes can supply their own inference API key during creation; NL tasks remain disabled for sandboxes that launch without a key.
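  A minimal sketch of the copy step, assuming you run it from the repo root:

  ```bash
  mkdir -p ~/.tsbx
  cp config/tsbx.sample.json ~/.tsbx/tsbx.json
  ```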
- **Start the core services**

  ```bash
  tsbx start
  ```

  Pass component names (e.g., `tsbx start api controller`) if you want to launch a subset.
- **Visit the Operator UI**

  Open the `host.url` defined in your `tsbx.json` (defaults to http://localhost) to browse sandboxes, launch tasks, and monitor activity. The REST API is available at `<host>/api`.
If something misbehaves, run `tsbx doctor` or `tsbx fix` from the CLI for guided troubleshooting.
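Both are ordinary CLI invocations; exactly which checks they run isn't documented here, so treat the comments as assumptions:

```bash
tsbx doctor   # guided troubleshooting: inspect the local setup
tsbx fix      # guided troubleshooting: attempt repairs
```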
TSBX reads all branding + inference metadata from a single JSON file (default `~/.tsbx/tsbx.json`, override with `tsbx start --config <path>`). Use `config/tsbx.sample.json` as a starting point:
```json
{
  "host": {
    "name": "TSBX",
    "url": "http://localhost"
  },
  "inference": {
    "providers": [
      {
        "name": "Positron",
        "url": "https://api.positron.ai/v1/chat/completions",
        "models": [
          { "name": "llama-3.2-3b-instruct-fast-tp2" },
          { "name": "llama-3.1-8b-instruct-good-tp2" }
        ]
      }
    ]
  }
}
```

- `host.name` / `host.url` drive Operator UI branding and task links.
- Each provider must define at least one model. The first entry is treated as that provider's default model.
- The Controller injects provider URL/model into each sandbox; the Operator UI fetches the list via `GET /api/v0/inference/providers`.
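For a quick smoke test of that endpoint, a plain `curl` works. This sketch assumes the default `host.url` of http://localhost and that `jq` is installed for pretty-printing; the response shape isn't documented here:

```bash
curl -s http://localhost/api/v0/inference/providers | jq .
```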
