Your own Claude Code UI. Open source, self-hosted, runs entirely on your machine.
Join our Discord server to get help, share feedback, and connect with other users.
- Multiple sandboxes - Docker (local), E2B (cloud), or Modal (cloud).
- Use your own plans - Claude Max, OpenRouter, OpenAI, or custom providers.
- Full IDE experience - VS Code in browser, terminal, file explorer.
- Extensible - Skills, agents, slash commands, MCP servers.
git clone https://github.com/Mng-dev-ai/claudex.git
cd claudex
docker compose up -d

See Desktop Setup to run Claudex as a native macOS app.
Run AI agents in isolated environments with multiple sandbox providers:
- Docker - Fully local, no external dependencies
- E2B - Cloud sandboxes with e2b.dev
- Modal - Serverless cloud sandboxes with modal.com
- VS Code editor in the browser
- Terminal with full PTY support
- File system management
- Port forwarding for web previews
- Environment checkpoints and snapshots
View and interact with a browser running inside the sandbox via VNC. Use Playwright MCP with Chrome DevTools Protocol (CDP) to let Claude control the browser programmatically.
Switch between providers in the same chat:
- Anthropic - Use your Max plan
- OpenAI - Use your ChatGPT Pro subscription (GPT-5.2 Codex, GPT-5.2)
- OpenRouter - Access to multiple model providers
- Custom - Any Anthropic-compatible API endpoint
- Custom Skills - ZIP packages with YAML metadata
- Custom Agents - Define agents with specific tool configurations
- Slash Commands - Built-in (`/context`, `/compact`, `/review`, `/init`)
- MCP Servers - Model Context Protocol support (NPX, BunX, UVX, HTTP)
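Custom skills ship as ZIP packages with YAML metadata. As a minimal sketch of building such a package, here the file names (`skill.yaml`, `SKILL.md`) and metadata fields are illustrative assumptions, not the exact schema Claudex requires:

```python
import io
import zipfile

# Hypothetical metadata for a skill package; the field names are assumptions.
metadata = """\
name: changelog-writer
description: Drafts changelog entries from recent git history
"""

# Build the ZIP in memory: YAML metadata plus the skill's instructions.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("skill.yaml", metadata)
    zf.writestr("SKILL.md", "# Changelog Writer\nSummarize commits as changelog bullets.")

print(zipfile.ZipFile(buf).namelist())  # ['skill.yaml', 'SKILL.md']
```

Writing the buffer to disk (or zipping a directory directly) produces the file you would upload in the Claudex UI.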
Automate recurring tasks with Celery workers.
- Fork chats from any message point
- Restore to any previous message in history
- File attachments with preview
- Web preview for running applications
- Mobile viewport simulation
- File previews: Markdown, HTML, images, CSV, PDF, PowerPoint
- Browse and install plugins from official catalog
- One-click installation of agents, skills, commands, MCPs
- Environment variables for sandbox execution
- Gmail - Read, send, and manage emails via Gmail MCP Server
- Go to Google Cloud Console
- Create a project and enable the Gmail API
- Create OAuth credentials (Desktop app for localhost, Web application for hosted URLs)
- If using Web application, add redirect URI: https://YOUR_DOMAIN/api/v1/integrations/gmail/callback
- Download the JSON credentials file
- In Claudex Settings → Integrations, upload your credentials file
- Click Connect Gmail to authorize
- System prompts for global context
- Custom instructions injected with each message
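The distinction above is that the system prompt applies globally while custom instructions ride along with every message. A sketch of how the two might be combined before dispatch (function and field names are illustrative, not Claudex internals):

```python
def build_request(system_prompt: str, custom_instructions: str, user_text: str) -> dict:
    """Global system prompt goes in the system field; custom
    instructions are prepended to each user message."""
    return {
        "system": system_prompt,
        "messages": [
            {"role": "user", "content": f"{custom_instructions}\n\n{user_text}"},
        ],
    }

req = build_request(
    "You are a senior reviewer.",
    "Always answer in bullet points.",
    "Review this diff.",
)
print(req["messages"][0]["content"].startswith("Always"))  # True
```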
Configure providers in Settings → Providers after login.
All providers use Claude Code under the hood. Non-Anthropic providers work through Anthropic Bridge, which translates Anthropic API calls to other providers.
┌─────────────┐ ┌───────────────────┐ ┌───────────────────────┐
│ Claudex │────▶│ Anthropic Bridge │────▶│ OpenAI / OpenRouter │
│ │ │ (API Translator) │ │ / Custom │
└─────────────┘ └───────────────────┘ └───────────────────────┘
This means all providers share the same conversation history stored in ~/.claude JSONL files, plus the same slash commands, skills, agents, and MCP servers. You can develop a feature with Claude, then switch to GPT-5.2 Codex for review—it already has the full context without needing to re-read files.
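Because the history is plain JSONL (one JSON object per line), it is easy to inspect with a few lines of code. A sketch of parsing such a file, where the sample data and its field names (`role`, `content`) are assumptions for illustration rather than the exact transcript schema:

```python
import json

# Hypothetical sample of a JSONL transcript: one JSON object per line.
sample = "\n".join([
    json.dumps({"role": "user", "content": "Add a login page"}),
    json.dumps({"role": "assistant", "content": "Done. See src/Login.tsx"}),
])

def load_history(text: str) -> list[dict]:
    """Parse JSONL: each non-empty line is one message object."""
    return [json.loads(line) for line in text.splitlines() if line.strip()]

history = load_history(sample)
print(len(history))        # 2
print(history[0]["role"])  # user
```

Pointing `load_history` at a file under `~/.claude` would work the same way, since every provider writes to the same store.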
| Provider | Auth Method | Models |
|---|---|---|
| Anthropic | OAuth token from `claude setup-token` | Sonnet 4.5, Opus 4.5, Haiku 4.5 |
| OpenAI | Auth file from `codex login` | GPT-5.2 Codex, GPT-5.2 |
| OpenRouter | API key | Multiple providers |
| Custom | API key | Any Anthropic-compatible endpoint |
Use OpenAI models with your ChatGPT Pro subscription:
- Install Codex CLI
- Run `codex login` and authenticate with your ChatGPT account
- In Claudex Settings → Providers, add an OpenAI provider
- Upload your `~/.codex/auth.json` file
Use any Anthropic-compatible API endpoint:
- Get your API key from your provider
- In Claudex Settings → Providers, add a Custom provider
- Enter your API endpoint URL and API key
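To sanity-check an endpoint before adding it, you can build an Anthropic-style `POST /v1/messages` request against it. A sketch using only the standard library; the base URL, API key, and model name are placeholders you would replace with your provider's values:

```python
import json
import urllib.request

BASE_URL = "https://api.example.com"  # placeholder: your provider's endpoint
payload = {
    "model": "claude-sonnet-4-5",  # assumed model name; use one your provider serves
    "max_tokens": 256,
    "messages": [{"role": "user", "content": "Hello"}],
}
req = urllib.request.Request(
    f"{BASE_URL}/v1/messages",
    data=json.dumps(payload).encode(),
    headers={
        "x-api-key": "YOUR_API_KEY",
        "anthropic-version": "2023-06-01",
        "content-type": "application/json",
    },
    method="POST",
)
# urllib.request.urlopen(req) would send it; left out here to avoid a live call.
print(req.get_full_url())  # https://api.example.com/v1/messages
```

If the endpoint answers this request shape correctly, Claudex's Anthropic Bridge should be able to talk to it as well.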
Compatible coding plans: Claude Max, ChatGPT Pro (via Codex CLI), or OpenRouter. You only need one AI provider configured.
┌─────────────────┐ ┌─────────────────┐ ┌─────────────────┐
│ Frontend │────▶│ FastAPI │────▶│ PostgreSQL │
│ React/Vite │ │ Backend │ │ Database │
└─────────────────┘ └────────┬────────┘ └─────────────────┘
│
┌────────────┼────────────┐
▼ ▼ ▼
┌───────────┐ ┌───────────┐ ┌─────────────────┐
│ Redis │ │ Celery │ │ Docker Sandbox │
│ Pub/Sub │ │ Workers │ │ │
└───────────┘ └───────────┘ └─────────────────┘
Frontend: React 19, TypeScript, Vite, TailwindCSS, Zustand, React Query, Monaco Editor, XTerm.js
Backend: FastAPI, Python 3.13, SQLAlchemy 2.0, Celery, Redis, Granian
| Service | Port |
|---|---|
| Frontend | 3000 |
| Backend API | 8080 |
| PostgreSQL | 5432 |
| Redis | 6379 |
docker compose up -d # Start
docker compose down # Stop
docker compose logs -f # Logs

For production deployment on a VPS, see the Coolify Installation Guide.
- API Docs: http://localhost:8080/api/v1/docs
- Admin Panel: http://localhost:8080/admin
Default admin: admin@example.com / admin123
- Fork the repository
- Create a feature branch
- Commit your changes
- Open a Pull Request
Apache 2.0 - see LICENSE

