bridge-ws

WebSocket bridge for CLI AI agents — stream Claude, Codex, and Ollama responses over a persistent connection.

Install

npm install -g bridge-ws

Or run without installing:

npx bridge-ws

Quick start

1. Start the server:

bridge-ws
╔═══════════════════════════════════════╗
║           bridge-ws v2.0.2            ║
║          CLI AI Agent Bridge          ║
╚═══════════════════════════════════════╝

Found Claude CLI: 1.x.x
bridge-ws running on ws://localhost:9999
Health check: http://localhost:9999/healthz
Press Ctrl+C to stop

2. Connect and send a prompt:

import WebSocket from "ws";

const ws = new WebSocket("ws://localhost:9999");

ws.on("message", (data) => {
  const msg = JSON.parse(data.toString());

  if (msg.type === "connected") {
    ws.send(JSON.stringify({
      type: "prompt",
      prompt: "Say hello in one sentence.",
      requestId: "req-1",
    }));
  }

  if (msg.type === "chunk") process.stdout.write(msg.content);
  if (msg.type === "complete") ws.close();
  if (msg.type === "error") { console.error(msg.message); ws.close(); }
});

Requirements

  • Node.js ≥ 20
  • Claude Code CLI (npm install -g @anthropic-ai/claude-code)
  • Codex CLI (optional, for provider: "codex")
  • Ollama (optional, for provider: "ollama" — no API key needed)

Features

  • Request multiplexing — send multiple prompts on a single connection; responses are tagged by requestId
  • Streaming — response chunks arrive as they are produced, not after completion
  • Cancellation — cancel any in-flight request by requestId
  • Three providers — Claude (claude), Codex (codex), and Ollama (ollama) on the same server
  • Optional auth — Bearer token authentication via BRIDGE_WS_API_KEY
  • Health check — GET /healthz on the same port
  • Library-ready — import AgentWebSocketServer directly and inject custom runners

CLI options

Flag                       Default                   Description
-p, --port <port>          9999                      Listen port
-H, --host <host>          localhost                 Bind address
-c, --claude-path <path>   claude                    Path to Claude CLI
--codex-path <path>        codex                     Path to Codex CLI
--ollama-url <url>         http://localhost:11434    Ollama base URL
-t, --timeout <seconds>    300                       Per-request CLI timeout
--log-level <level>        info                      debug, info, warn, error
--origins <origins>        (any)                     Comma-separated allowed origins
--max-turns <n>            (unlimited)               Max Claude agentic turns
--tools <tools>            (all)                     Comma-separated Claude tools to enable
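For example, several of the flags above can be combined in one invocation. The tool names passed to --tools below are illustrative; use the tool names your Claude CLI version accepts:

```shell
# Bind to all interfaces on port 8080 with verbose logging,
# and restrict Claude to a limited tool set (flags from the table above)
bridge-ws --host 0.0.0.0 --port 8080 --log-level debug --tools Read,Write
```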

Environment variables:

Variable             Description
BRIDGE_WS_API_KEY    If set, connecting clients must present this value as a Bearer token; if unset, auth is disabled

Protocol

Client → Server

// Send a prompt (provider: "claude" | "codex" | "ollama", defaults to "claude")
{ "type": "prompt", "prompt": "...", "requestId": "req-1", "provider": "claude" }

// Cancel a request
{ "type": "cancel", "requestId": "req-1" }

Server → Client

// On connect
{ "type": "connected", "version": "2.0", "agent": "bridge-ws" }

// Response fragments
{ "type": "chunk", "content": "...", "requestId": "req-1" }

// On completion
{ "type": "complete", "requestId": "req-1" }

// On error or cancellation
{ "type": "error", "message": "...", "requestId": "req-1" }

Full protocol details: docs/reference.md
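As a sketch of the client→server frames above, these small helpers (hypothetical names, not part of the bridge-ws API) serialize the two message types; per the protocol, provider defaults to "claude" when omitted:

```javascript
// Hypothetical helpers that build the client→server frames shown above.
function promptMessage(prompt, requestId, provider = "claude") {
  return JSON.stringify({ type: "prompt", prompt, requestId, provider });
}

function cancelMessage(requestId) {
  return JSON.stringify({ type: "cancel", requestId });
}
```

Usage: `ws.send(promptMessage("Summarize this file.", "req-1", "ollama"))`, then `ws.send(cancelMessage("req-1"))` to abort it.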

Concurrent requests

// Both requests run concurrently on the same connection
ws.send(JSON.stringify({ type: "prompt", prompt: "First task", requestId: "a" }));
ws.send(JSON.stringify({ type: "prompt", prompt: "Second task", requestId: "b" }));

ws.on("message", (data) => {
  const msg = JSON.parse(data.toString());
  if (msg.type === "chunk") {
    console.log(`[${msg.requestId}] ${msg.content}`);
  }
});
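Because chunks from concurrent requests interleave, one way to reassemble full responses is a small demultiplexer keyed by requestId. This is a client-side sketch, not part of the bridge-ws API:

```javascript
// Hypothetical demultiplexer: accumulates chunk frames per requestId and
// invokes a callback with the full text when the matching complete arrives.
class ResponseDemux {
  constructor() {
    this.buffers = new Map();
  }

  // Feed every parsed server→client message here.
  handle(msg, onDone) {
    if (msg.type === "chunk") {
      const prev = this.buffers.get(msg.requestId) ?? "";
      this.buffers.set(msg.requestId, prev + msg.content);
    } else if (msg.type === "complete") {
      onDone(msg.requestId, this.buffers.get(msg.requestId) ?? "");
      this.buffers.delete(msg.requestId);
    } else if (msg.type === "error") {
      this.buffers.delete(msg.requestId);
    }
  }
}
```

Wire it into the message handler: `const demux = new ResponseDemux(); ws.on("message", (d) => demux.handle(JSON.parse(d.toString()), (id, text) => console.log(id, text)));`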

Authentication

Start the server with a key:

BRIDGE_WS_API_KEY=my-secret bridge-ws

Connect with the matching Bearer token:

const ws = new WebSocket("ws://localhost:9999", {
  headers: { Authorization: "Bearer my-secret" },
});

Library usage

import { AgentWebSocketServer } from "bridge-ws";
import pino from "pino";

const server = new AgentWebSocketServer({
  port: 9999,
  host: "127.0.0.1",
  logger: pino(),
  apiKey: process.env.BRIDGE_WS_API_KEY,
  maxTurns: 5,
});

await server.start();

Documentation

  • Tutorial — step-by-step first run
  • How-to guides — auth, tools, cancellation, reverse proxy, library usage
  • Reference — all CLI flags, message types, HTTP endpoints
  • Explanation — design decisions and architecture

License

MIT
