Web clients for ayo - available in both connected and offline modes.
| Mode | Description | Link |
|---|---|---|
| Connected | Connect to a running ayo server | Open Connected Client |
| Offline | Run entirely in browser (no server needed) | Open Offline Client |
## Connected Client

The connected client connects to a running ayo server for full agent capabilities.

1. Start the ayo server on your machine:

   ```sh
   ayo serve --port 8080
   ```

2. Open the connected web client.

3. Connect using one of these methods:
   - **Scan QR code**: point your camera at the QR code displayed in your terminal
   - **Manual entry**: enter the server URL and token shown in the terminal output

4. Select an agent and start chatting.
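For manual entry, user input has to be normalized into a base URL and paired with the token. A minimal sketch of that step — the helper names and the bearer-token header shape are illustrative assumptions, not the client's actual code:

```javascript
// Normalize a user-entered server address into a base URL.
// Hypothetical helper for illustration; the real client may differ.
function normalizeServerUrl(input) {
  // Default to http:// when the user omits a scheme.
  const withScheme = /^https?:\/\//.test(input) ? input : `http://${input}`;
  return new URL(withScheme).origin; // drops trailing slashes and paths
}

// Build the headers a token-authenticated request might carry.
function authHeaders(token) {
  return { Authorization: `Bearer ${token}` };
}

console.log(normalizeServerUrl("localhost:8080")); // → "http://localhost:8080"
```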
### Features

- Single HTML file (no build step required)
- Dark/light theme based on system preference
- QR code scanning for easy pairing
- SSE streaming for real-time responses
- Session management with history
- Tool call visualization
- Reasoning block display (collapsible)
- Connection persistence via localStorage
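SSE streaming delivers responses as text frames of `data:` lines terminated by a blank line. A minimal frame parser sketches the idea — the JSON payload shape shown is an assumption, not the client's real wire protocol:

```javascript
// Extract the data payload from one SSE frame, e.g.
//   'data: {"delta":"Hello"}\n\n'
// Returns null for frames with no data field (comments, bare events).
function parseSseFrame(frame) {
  const dataLines = frame
    .split("\n")
    .filter((line) => line.startsWith("data:"))
    .map((line) => line.slice(5).trim());
  if (dataLines.length === 0) return null;
  // Per the SSE spec, multiple data lines are joined with newlines.
  return dataLines.join("\n");
}

const payload = parseSseFrame('data: {"delta":"Hello"}\n\n');
console.log(JSON.parse(payload).delta); // → "Hello"
```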
## Offline Client

The offline client runs entirely in your browser with no server required.

1. Open the offline web client.

2. Configure an LLM provider in Settings:
   - **WebLLM (local)**: download a model for local GPU inference
   - **Cloud API**: enter an API key for OpenAI, Anthropic, or OpenRouter

3. Start chatting!
### Features

- Chat with AI: WebLLM for local inference or cloud APIs
- Terminal: Full Linux shell via TinyEMU WASM
- Files: Browse and edit the browser-based filesystem
- Settings: API key management, model downloads, storage
### Requirements

For local models (WebLLM):
- Chrome 113+ or Edge 113+ (WebGPU required)
- GPU with 2-8GB VRAM depending on model
For cloud APIs:
- Any modern browser
- API key for OpenAI, Anthropic, or OpenRouter
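The split above can be feature-detected at runtime: WebGPU is exposed as `navigator.gpu` in supporting browsers. A sketch written against a navigator-like parameter so it runs anywhere — the mode names are illustrative, not the client's actual identifiers:

```javascript
// Decide which provider modes are available for a given navigator.
// Pure function so it can be tested outside the browser.
function availableModes(nav) {
  const modes = ["cloud-api"]; // any modern browser can call cloud APIs
  if (nav && "gpu" in nav) {
    modes.push("webllm"); // WebGPU present: local inference is possible
  }
  return modes;
}

// In a real page you would call availableModes(navigator).
console.log(availableModes({ gpu: {} })); // → ["cloud-api", "webllm"]
```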
The offline client uses:
- WebLLM for local model inference on your GPU
- TinyEMU RISC-V emulator compiled to WebAssembly for shell access
- IndexedDB for persistent storage (models, sessions, files)
- AES-256-GCM encryption for API key storage
## Architecture

The connected client is a self-contained HTML file:

```
index.html   # Single-file connected client
```

The offline client consists of:

```
offline/
├── index.html        # Main UI
├── worker.js         # Web Worker for TinyEMU
├── js/
│   ├── app.js        # Application logic
│   ├── storage.js    # IndexedDB with encryption
│   ├── llm-router.js # WebLLM/API routing
│   ├── emulator.js   # TinyEMU controller
│   ├── terminal.js   # xterm.js wrapper
│   └── protocol.js   # VM-LLM communication
├── assets/
│   ├── tinyemu.wasm
│   └── wasm_exec.js
└── tests/            # Test suites
```
## Development

1. Clone the repository.

2. Serve with any static file server:

   ```sh
   python -m http.server 8000   # or npx serve .
   ```

3. Open `http://localhost:8000` for connected mode.

4. Open `http://localhost:8000/offline/` for offline mode.
## License

MIT