Let AI help you explore databases and generate SQL — as simple as chatting with an expert
🇺🇸 English | 🇨🇳 简体中文
Key Highlights • Quick Start • Features • Tech Stack
Unlike simple "text-to-SQL" tools, TableChat's Agent Mode lets AI work like a real database expert:
| Capability | Description |
|---|---|
| 🔍 Autonomous Exploration | AI proactively examines table structures and understands relationships |
| 💭 Transparent Thinking | Watch AI's reasoning process and tool calls in real-time |
| 🛠️ Smart Tools | List tables, check schemas, run test queries — step by step |
| ✅ Any SQL | SELECT, CREATE INDEX, ALTER TABLE — all supported |
💡 **Example: Creating an Index**

```
👤 User: Help me add an index on user_id for the orders table

🤖 Agent thinking...
├─ 🔧 list_tables → Found orders, users, products...
├─ 🔧 get_table_schema("orders") → Found user_id column
└─ 💡 Generated: CREATE INDEX idx_orders_user_id ON orders(user_id);

✅ SQL generated, click to copy to editor
```
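The generated statement can be sanity-checked locally before running it in production; a minimal sketch using Python's built-in `sqlite3` module (the `orders` table here is a stand-in for the example above):

```python
import sqlite3

# In-memory stand-in for the orders table from the example
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, user_id INTEGER)")

# The SQL the agent generated
conn.execute("CREATE INDEX idx_orders_user_id ON orders(user_id)")

# Confirm the index now appears in the schema catalog
indexes = [row[0] for row in conn.execute(
    "SELECT name FROM sqlite_master WHERE type = 'index'")]
print(indexes)  # ['idx_orders_user_id']
```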
Start in 30 seconds — no Python/Node environment needed!
```bash
# 1. Clone the project
git clone https://github.com/your-username/tableChat.git
cd tableChat

# 2. Configure API Key
cp .env.example .env
# Edit .env and add your LLM_API_KEY (see "Environment Variables" section below)

# 3. One-click start
docker compose up -d

# 🎉 Done!
# Frontend: http://localhost:5888
# API: http://localhost:7888/docs
```

📋 Common Commands

```bash
docker compose ps          # Check status
docker compose logs -f     # View logs
docker compose down        # Stop services
docker compose up --build  # Rebuild
```

💡 AI automatically explores table structures → Executes validation queries → Generates precise SQL → Outputs in Markdown format
🔧 Collapsible tool call blocks showing the complete `list_tables` → `get_table_schema` → `query_database` chain
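The chain above can be pictured as three plain functions over a database connection; a simplified illustration of the idea (not the project's actual tool implementations), using `sqlite3` as the backing store:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, user_id INTEGER)")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")

def list_tables():
    """Step 1: enumerate tables so the agent knows what exists."""
    rows = conn.execute("SELECT name FROM sqlite_master WHERE type = 'table'")
    return [r[0] for r in rows]

def get_table_schema(table):
    """Step 2: return column names so the agent can pick the right one."""
    return [r[1] for r in conn.execute(f"PRAGMA table_info({table})")]

def query_database(sql):
    """Step 3: run a small validation query before emitting final SQL."""
    return conn.execute(sql).fetchall()

# The agent's exploration chain from the example above
print(list_tables())               # ['orders', 'users']
print(get_table_schema("orders"))  # ['id', 'user_id']
print(query_database("SELECT COUNT(*) FROM orders"))  # [(0,)]
```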
| SQL Editor | Natural Language | Agent Mode |
|---|---|---|
| Monaco Editor | Quick generation for simple scenarios | Intelligent exploration for complex scenarios |
| Syntax highlighting, auto-completion | Two-stage prompt chain optimization | Real-time streaming output |
| Ctrl+Enter to execute | Supports large databases | Collapsible tool calls |
TableChat's Agent Mode is a core feature that requires strong tool-use capabilities from the LLM. In our testing, Anthropic's Claude performed best in Agent scenarios:

- 🧠 More Precise Tool Calls — Claude accurately understands when to call which tool
- 🔗 Better Multi-step Reasoning — Correctly chains `list_tables` → `get_schema` → `query` steps in complex scenarios
- 📝 Clearer Chain of Thought — More readable and organized reasoning output

The TableChat backend therefore uses the Anthropic SDK exclusively.
However, we understand that many users want to use other LLM services (like vLLM, Azure OpenAI, locally deployed models, etc.). To support OpenAI-compatible services, we introduced claude-code-proxy as a unified entry point:
```
┌─────────────┐                     ┌───────────────────┐                   ┌─────────────────┐
│  TableChat  │   Anthropic API     │ claude-code-proxy │    Anthropic/     │   LLM Service   │
│  (Backend)  │ ──────────────────> │      (Proxy)      │    OpenAI API     │  (Claude/vLLM)  │
└─────────────┘                     └───────────────────┘ ────────────────> └─────────────────┘
                                             ↑
                                    Unified entry point
                                      for all requests
```
Advantages:

- ✅ Simple backend code — only one set of Anthropic SDK code to maintain
- ✅ Unified configuration — switch LLMs by changing environment variables, no code changes
- ✅ One-click deployment — `docker compose up` automatically starts the proxy
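The kind of request translation such a proxy performs can be sketched in a few lines. The field names below follow the public Anthropic Messages and OpenAI Chat Completions formats; this is a simplified illustration, not claude-code-proxy's actual code:

```python
def anthropic_to_openai(body: dict) -> dict:
    """Translate an Anthropic Messages request into an
    OpenAI Chat Completions request (illustrative subset only)."""
    messages = []
    # Anthropic carries the system prompt as a top-level field;
    # OpenAI expects it as the first chat message.
    if "system" in body:
        messages.append({"role": "system", "content": body["system"]})
    messages.extend(body["messages"])
    return {
        "model": body["model"],
        "max_tokens": body["max_tokens"],
        "messages": messages,
    }

request = {
    "model": "openai/qwen/qwen3-4b-2507",
    "max_tokens": 1024,
    "system": "You are a database expert.",
    "messages": [{"role": "user", "content": "List all tables."}],
}
translated = anthropic_to_openai(request)
print(translated["messages"][0]["role"])  # system
```

The real proxy also has to translate tool definitions, tool-call results, and streaming events, which is exactly the complexity that keeping a single Anthropic code path avoids in the backend.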
| Variable | Description | Default |
|---|---|---|
| `LLM_API_KEY` | API Key (required) | - |
| `LLM_MODEL` | Model to use | `claude-sonnet-4-5-20250929` |
| `UPSTREAM_API_TYPE` | Upstream type: `anthropic` or `openai` | `anthropic` |
| `UPSTREAM_API_BASE` | Upstream API URL (optional) | Auto-selected based on type |
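The "auto-selected" default can be pictured as a small lookup; a hedged sketch of the resolution logic (the fallback URLs below are the public Anthropic and OpenAI endpoints and are assumptions about the proxy's behavior, not its actual code):

```python
import os

# Assumed fallback endpoints; the proxy's actual defaults may differ.
DEFAULT_BASES = {
    "anthropic": "https://api.anthropic.com",
    "openai": "https://api.openai.com/v1",
}

def resolve_upstream_base() -> str:
    """Use UPSTREAM_API_BASE if set, else fall back by UPSTREAM_API_TYPE."""
    explicit = os.environ.get("UPSTREAM_API_BASE")
    if explicit:
        return explicit
    api_type = os.environ.get("UPSTREAM_API_TYPE", "anthropic")
    return DEFAULT_BASES[api_type]

os.environ["UPSTREAM_API_TYPE"] = "openai"
os.environ.pop("UPSTREAM_API_BASE", None)
print(resolve_upstream_base())  # https://api.openai.com/v1
```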
Direct use of the Claude API for best Agent performance:

```bash
LLM_API_KEY=sk-ant-api03-xxxxx

# That's it! One-click start:
docker compose up
```

Connect to vLLM, LM Studio, Ollama, etc.:

```bash
LLM_API_KEY=your-key
UPSTREAM_API_TYPE=openai
UPSTREAM_API_BASE=http://your-server:8000/v1
# ⚠️ Model name needs the openai/ prefix
LLM_MODEL=openai/qwen/qwen3-4b-2507

# Same one-click start:
docker compose up
```
⚠️ Note: Agent performance in OpenAI-compatible mode depends on the model's tool-use capability. GPT-4o or an equivalent model is recommended.
Legacy variables are still supported: `AGENT_API_KEY`, `AGENT_API_BASE`, `AGENT_MODEL`
| Backend | Frontend | Deployment |
|---|---|---|
| Python 3.13 + FastAPI<br>Anthropic SDK<br>asyncpg / aiomysql<br>SQLite + FTS5 | React 19 + TypeScript<br>Ant Design 5<br>Monaco Editor<br>Refine 5 | Docker Compose<br>Nginx<br>Health Check<br>Volume Persistence |
```
tableChat/
├── backend/                  # Python backend
│   ├── app/
│   │   ├── api/v1/           # API routes (including agent endpoints)
│   │   ├── services/         # Business logic (agent_service, agent_tools)
│   │   ├── connectors/       # Database connectors
│   │   └── models/           # Pydantic models
│   └── Dockerfile
├── frontend/                 # React frontend
│   ├── src/
│   │   ├── components/
│   │   │   ├── agent/        # 🤖 Agent mode components
│   │   │   ├── editor/       # SQL editor
│   │   │   └── ...
│   │   └── pages/
│   └── Dockerfile
└── docker-compose.yml        # One-click deployment
```
- 🤖 Agent Mode — Claude-powered intelligent database exploration
- 💬 Natural Language Query — Two-stage prompt chain, supports large databases
- 🗄️ Multi-database Support — PostgreSQL + MySQL
- 📊 Multi-format Export — CSV / JSON / XLSX
- 📜 Query History — Full-text search (FTS5)
- 🔐 SSH Tunnel — Secure connection to internal databases
- 🔌 Unified LLM API — Anthropic + OpenAI compatible mode
- 🐳 One-click Deployment — Docker Compose out of the box
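The FTS5-backed query history can be sketched in a few lines with Python's built-in `sqlite3` (most CPython builds bundle SQLite with FTS5 enabled; this is an illustration, not TableChat's actual schema):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# FTS5 virtual table: every inserted query becomes full-text searchable
conn.execute("CREATE VIRTUAL TABLE query_history USING fts5(sql_text)")
conn.executemany(
    "INSERT INTO query_history (sql_text) VALUES (?)",
    [("SELECT * FROM orders WHERE user_id = 42",),
     ("CREATE INDEX idx_orders_user_id ON orders(user_id)",),
     ("SELECT name FROM users",)],
)

# Full-text search across past queries
hits = conn.execute(
    "SELECT sql_text FROM query_history WHERE query_history MATCH 'orders'"
).fetchall()
print(len(hits))  # 2
```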
- 📝 Query bookmarks and sharing
- 🎨 Custom themes
- 👥 Multi-user support
- 🔒 Permission management
- 📈 Query performance analysis
MIT License
⭐ If you find this useful, please give it a Star ⭐
Made with ❤️ by the TableChat Team


