A UV-based Python project that provides an OpenAI agent with access to BigQuery via MCP (Model Context Protocol) tools hosted on Google Cloud Run behind IAM authentication.
```
┌─────────────────┐
│  OpenAI Agent   │
│    (GPT-4)      │
└────────┬────────┘
         │
         │ Function Calling
         │
┌────────▼────────┐
│   MCP Client    │
│  (auth + HTTP)  │
└────────┬────────┘
         │
         │ IAM Auth + HTTPS
         │
┌────────▼────────────────────────────────────────┐
│              Cloud Run MCP Service              │
│   https://bigquery-mcp-prod-729200795925...     │
│                                                 │
│               ┌──────────────┐                  │
│               │ BigQuery MCP │                  │
│               │    Tools     │                  │
│               └──────────────┘                  │
└─────────────────────────────────────────────────┘
```
- 🤖 OpenAI Agent: Uses GPT-4 with function calling for task automation
- 🔐 IAM Authentication: Secure access to Cloud Run with automatic token refresh
- 🛠️ MCP Integration: Automatic tool discovery and execution
- 🐳 Docker Support: Complete Docker Compose setup for easy deployment
- 🔄 Token Management: Automatic token refresh before expiration
- 📝 Multiple Auth Methods: Supports ADC, service account keys, and Workload Identity
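The token-management behaviour can be sketched as a small wrapper that caches a token and renews it shortly before expiry. `fetch_token` below is a hypothetical callable standing in for the real Cloud Run ID-token fetch in `src/auth_handler.py`; a minimal sketch, assuming it returns the token and its lifetime:

```python
import time
from typing import Callable, Tuple


class RefreshingToken:
    """Cache a bearer token and refresh it shortly before it expires.

    fetch_token is a hypothetical callable returning (token, lifetime_seconds);
    the real project fetches a Cloud Run ID token instead.
    """

    def __init__(self, fetch_token: Callable[[], Tuple[str, float]], skew: float = 60.0):
        self._fetch = fetch_token
        self._skew = skew            # refresh this many seconds before expiry
        self._token = None
        self._expires_at = 0.0

    def get(self) -> str:
        now = time.time()
        if self._token is None or now >= self._expires_at - self._skew:
            token, lifetime = self._fetch()
            self._token = token
            self._expires_at = now + lifetime
        return self._token
```

With this shape the HTTP layer simply calls `get()` before every request and never sends an expired token.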
- Python 3.11+
- UV: fast Python package installer
- Docker & Docker Compose (for containerized deployment)
- Google Cloud SDK (for authentication)
- OpenAI API Key
```bash
# Navigate to project directory
cd client-mcp

# Install dependencies using UV
uv sync
```

**Option A: Application Default Credentials (ADC) - Recommended**

```bash
# Authenticate with Google Cloud
gcloud auth application-default login

# Verify authentication
gcloud auth application-default print-access-token
```

**Option B: Service Account Key File**
- Create a service account in GCP Console
- Grant it the `Cloud Run Invoker` role
- Download the JSON key file
- Set environment variable:

```bash
export SERVICE_ACCOUNT_KEY_PATH=/path/to/key.json
```

**Option C: Workload Identity (for GKE/Cloud Run)**
See commented examples in src/auth_handler.py for detailed setup instructions.
```bash
# Create .env file
echo "OPENAI_API_KEY=your-openai-api-key-here" > .env
```

```bash
# Activate virtual environment
source .venv/bin/activate

# Run the application
python src/main.py

# Or with a custom task
python src/main.py "Show me the schema of the users table"
```

```bash
# Build and run
docker compose up

# Run in detached mode
docker compose up -d

# View logs
docker compose logs -f

# Stop
docker compose down
```

Edit docker-compose.yml and modify the command:

```yaml
command: python src/main.py "Your custom task here"
```

In `main.py`, set `task_description` to one of:
- "List all available BigQuery datasets and tables"
- "Query the sales data for the last 30 days and provide a summary"
- "Show me the schema of the users table in the analytics dataset"
- "Analyze the top 10 products by revenue in Q4 and create a summary report with insights"

```
client-mcp/
├── pyproject.toml          # UV project configuration
├── docker-compose.yml      # Docker Compose configuration
├── Dockerfile              # Container image definition
├── .env                    # Environment variables (create this)
├── .gitignore              # Git ignore patterns
├── .dockerignore           # Docker ignore patterns
├── README.md               # This file
└── src/
    ├── __init__.py         # Package initialization
    ├── main.py             # Application entry point
    ├── auth_handler.py     # Cloud Run IAM authentication
    ├── mcp_client.py       # MCP protocol client
    └── agent.py            # OpenAI agent with tool calling
```
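The bridge in `agent.py` between MCP tool discovery and OpenAI function calling can be sketched as a pure transformation; a minimal sketch, assuming MCP tool definitions carry `name`, `description`, and a JSON Schema `inputSchema`, and that the output feeds the OpenAI `tools` parameter:

```python
def mcp_tool_to_openai(tool: dict) -> dict:
    """Convert one MCP tool definition into an OpenAI function-calling tool spec.

    MCP tools expose `name`, `description`, and a JSON Schema `inputSchema`;
    OpenAI expects that schema under `function.parameters`.
    """
    return {
        "type": "function",
        "function": {
            "name": tool["name"],
            "description": tool.get("description", ""),
            "parameters": tool.get(
                "inputSchema",
                {"type": "object", "properties": {}},
            ),
        },
    }
```

Keeping this conversion pure makes tool discovery trivially testable: list tools once over MCP, map them, and pass the result to every chat-completion call.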
Best for: local development, GCE, Cloud Run, GKE

```python
handler = CloudRunTokenHandler()
```

Setup:

```bash
gcloud auth application-default login
```

Best for: CI/CD, specific service account requirements

```python
handler = CloudRunTokenHandler(
    service_account_key_path="/path/to/key.json"
)
```

Setup:

- Create a service account
- Grant `roles/run.invoker`
- Download the JSON key
- Set the `SERVICE_ACCOUNT_KEY_PATH` env var
Best for: Production deployments in GKE/Cloud Run
Automatically configured when running in GKE or Cloud Run with proper IAM bindings.
See src/auth_handler.py for detailed setup instructions with example YAML and
commands.
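On GKE the binding is typically expressed as a Kubernetes ServiceAccount annotated with the Google service account it impersonates. A sketch with placeholder names (substitute your own accounts and project):

```yaml
# Hypothetical names - substitute your own service account and project.
apiVersion: v1
kind: ServiceAccount
metadata:
  name: mcp-client
  namespace: default
  annotations:
    iam.gke.io/gcp-service-account: mcp-client@your-project.iam.gserviceaccount.com
```

The Google service account still needs `roles/run.invoker`, and the two accounts are linked by granting `roles/iam.workloadIdentityUser` on the Google service account to the Kubernetes ServiceAccount.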
| Variable | Required | Description |
|---|---|---|
| `OPENAI_API_KEY` | Yes | OpenAI API key for GPT-4 access |
| `SERVICE_ACCOUNT_KEY_PATH` | No | Path to service account JSON key (for Method 2) |
| `GOOGLE_APPLICATION_CREDENTIALS` | No | Path to credentials file (auto-detected) |
Problem: "Failed to obtain valid credentials"
Solutions:
- Run `gcloud auth application-default login`
- Check that the service account has the `roles/run.invoker` permission
- Verify `GOOGLE_APPLICATION_CREDENTIALS` points to a valid file
- Ensure Docker has access to credentials (check volume mounts)
Problem: "MCP service health check failed"
Solutions:
- Verify Cloud Run service URL is correct
- Check network connectivity
- Ensure service account has proper permissions
- Check Cloud Run service is deployed and running
Problem: "Failed to refresh token"
Solutions:
- Re-authenticate: `gcloud auth application-default login`
- Check that the token hasn't been revoked
- Verify service account key is still valid
- Check system clock is synchronized
Problem: "OpenAI API key not set" or rate limit errors
Solutions:
- Verify the `.env` file exists and contains `OPENAI_API_KEY`
- Check that the API key is valid at platform.openai.com
- Ensure you have sufficient credits/quota
- Implement rate limiting if needed
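The last point can be handled with simple exponential backoff around the OpenAI call. A minimal sketch: the call is passed in as a function, and the retryable exception type is whatever your client library raises on rate limits (`sleep` is injectable so the logic can be tested without waiting):

```python
import time


def call_with_backoff(fn, retries: int = 5, base_delay: float = 1.0,
                      retryable=(Exception,), sleep=time.sleep):
    """Retry fn() with exponential backoff on retryable errors.

    Delays grow as base_delay * 2**attempt; the last failure is re-raised.
    """
    for attempt in range(retries):
        try:
            return fn()
        except retryable:
            if attempt == retries - 1:
                raise
            sleep(base_delay * (2 ** attempt))
```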
```bash
# Add to pyproject.toml first
# Then sync
uv sync
```

```bash
# Install dev dependencies
uv sync --dev

# Run tests
pytest
```

```bash
# Format code
ruff format .

# Lint code
ruff check .
```

- Never commit `.env` files or service account keys to version control
- Rotate service account keys regularly
- Use least-privilege IAM roles
- Enable audit logging for Cloud Run access
- Use Workload Identity in production when possible
- Fork the repository
- Create a feature branch
- Make your changes
- Test thoroughly
- Submit a pull request
MIT License - see LICENSE file for details
For issues and questions:
- Check the troubleshooting section above
- Review commented examples in source code
- Open an issue on GitHub
- Built with UV for fast Python package management
- Uses OpenAI's function calling API
- Implements MCP (Model Context Protocol)
- Secured with Google Cloud IAM