TSBX

Infra-layer Orchestrator for AI Agents

Overview

TSBX orchestrates long-lived, Docker-backed sandboxes for agent workflows. It bundles a Rust service stack, a Node.js CLI, and an Operator UI so teams can provision, monitor, and control persistent workspaces connected to an OpenAI-compatible inference endpoint.

Requirements

  • Linux host with Docker 20.10+
  • Node.js 18+ and npm (for the CLI)
  • Rust 1.82+ (only if you plan to build the Rust services locally)
  • Host + inference providers/models defined in ~/.tsbx/tsbx.json (copy config/tsbx.sample.json)

Quick Setup

  1. Install or link the CLI

    npm install -g ./cli        # from this repo
    # or
    npm install -g @tsbx/cli
    # for local changes
    ./scripts/link.sh
  2. Configure host + inference providers

    • Copy config/tsbx.sample.json to ~/.tsbx/tsbx.json (or supply --config <path> when running tsbx start).
    • Fill in the host block plus the inference section’s providers array. Each provider needs only a url and a models list. The first provider in the list becomes the default, and each provider’s first model is used unless you override it per sandbox.
    • Individual sandboxes can supply their own inference API key during creation; natural-language (NL) tasks remain disabled for sandboxes that launch without a key.
  3. Start the core services

    tsbx start

    Pass component names (e.g., tsbx start api controller) if you want to launch a subset.

  4. Visit the Operator UI
    Open the host.url defined in your tsbx.json (defaults to http://localhost) to browse sandboxes, launch tasks, and monitor activity. The REST API is available at <host>/api.
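Assuming host.url is a plain origin such as http://localhost, the REST API location follows the <host>/api convention above. A tiny sanity-check helper (hypothetical, not part of the CLI):

```python
# Derives the REST API base from host.url, following the "<host>/api"
# convention described above. Hypothetical helper for illustration.
def api_base(host_url: str) -> str:
    return host_url.rstrip("/") + "/api"
```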

If something misbehaves, run tsbx doctor or tsbx fix from the CLI for guided troubleshooting.

Configuration File

TSBX reads all branding + inference metadata from a single JSON file (default ~/.tsbx/tsbx.json, override with tsbx start --config <path>). Use config/tsbx.sample.json as a starting point:

{
  "host": {
    "name": "TSBX",
    "url": "http://localhost"
  },
  "inference": {
    "providers": [
      {
        "name": "Positron",
        "url": "https://api.positron.ai/v1/chat/completions",
        "models": [
          { "name": "llama-3.2-3b-instruct-fast-tp2" },
          { "name": "llama-3.1-8b-instruct-good-tp2" }
        ]
      }
    ]
  }
}
  • host.name / host.url drive Operator UI branding and task links.
  • Each provider must define at least one model. The first entry is treated as that provider’s default model.
  • The Controller injects provider URL/model into each sandbox; Operator UI fetches the list via GET /api/v0/inference/providers.
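A client-side sketch of consuming that endpoint. The response shape used here is an assumption mirroring the tsbx.json schema, not a documented API contract, and both helpers are hypothetical:

```python
import json
from urllib.request import urlopen

# The payload shape below is an ASSUMPTION mirroring the tsbx.json
# schema; the actual GET /api/v0/inference/providers contract may
# differ. Maps each provider name to its model names.
def provider_models(payload: dict) -> dict[str, list[str]]:
    return {
        p["name"]: [m["name"] for m in p["models"]]
        for p in payload.get("providers", [])
    }

def fetch_providers(host_url: str) -> dict:
    # e.g. http://localhost/api/v0/inference/providers
    url = host_url.rstrip("/") + "/api/v0/inference/providers"
    with urlopen(url) as resp:
        return json.load(resp)

# Offline example using an assumed payload:
sample = {"providers": [{"name": "Positron",
                         "url": "https://api.positron.ai/v1/chat/completions",
                         "models": [{"name": "llama-3.2-3b-instruct-fast-tp2"}]}]}
```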
