A multi-language workspace for testing AI providers (OpenAI, Anthropic, Google Gemini) with consistent patterns across Python, Node.js, and C++.
This repository provides reference implementations for integrating with major AI providers in three programming languages. Each environment features:
- Environment-based Configuration (development/production)
- Multi-Provider Support (OpenAI, Anthropic, Google Gemini)
- Unified Client Pattern (auto-selection with fallback)
- Cost Tracking & Limits
```
ai_testing/
├── README.md              # This file
├── python/                # Python environment
│   ├── venv/              # Virtual environment
│   ├── requirements.txt   # Dependencies
│   ├── config/            # Configuration system
│   ├── clients/           # AI provider clients
│   ├── frameworks/        # LangChain, LlamaIndex integrations
│   ├── storage/           # Vector stores (ChromaDB)
│   ├── tracking/          # Usage tracking
│   ├── tests/             # Test suite
│   └── docs/              # Documentation
├── nodejs/                # Node.js environment
│   ├── package.json       # Dependencies
│   ├── src/config/        # Configuration system
│   ├── src/clients/       # AI provider clients
│   └── src/index.js       # Entry point
└── cpp/                   # C++ environment
    ├── CMakeLists.txt     # Build configuration
    ├── include/           # Header files
    │   ├── config/        # Config structures
    │   └── clients/       # Client interfaces
    └── src/               # Source files
```
```bash
cd python
source venv/bin/activate
python -c "from clients.ai_clients import UnifiedClient; print(UnifiedClient().chat('Hello!'))"
```
```bash
cd nodejs
npm install
node src/index.js
```
```bash
cd cpp
mkdir build && cd build
cmake .. && cmake --build .
./ai_testing
```
All three environments use the same environment variables:
```bash
export OPENAI_API_KEY="sk-..."
export ANTHROPIC_API_KEY="sk-ant-..."
export GEMINI_API_KEY="..."
export ENVIRONMENT=development  # or 'production'
```
| Setting | Development | Production |
|---|---|---|
| OpenAI Model | gpt-3.5-turbo | gpt-4o |
| Anthropic Model | claude-3-haiku | claude-sonnet-4 |
| Google Model | gemini-1.5-flash | gemini-1.5-pro |
| Daily Cost Limit | $5.00 | $100.00 |
| Cache TTL | 24 hours | 1 hour |
| Log Level | DEBUG | INFO |
| Max Tokens | 512-1024 | 4096-8192 |
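Each environment's config module selects these defaults based on the `ENVIRONMENT` variable. As a rough illustration of the pattern only (the names and structure below are hypothetical, not the repo's actual API), the Python side could look something like this:

```python
# Hypothetical sketch of environment-based defaults; the actual module in
# python/config/ may use different names and structure.
import os
from dataclasses import dataclass

@dataclass
class ModelDefaults:
    openai_default: str
    anthropic_default: str
    gemini_default: str
    daily_cost_limit_usd: float

DEFAULTS = {
    "development": ModelDefaults("gpt-3.5-turbo", "claude-3-haiku", "gemini-1.5-flash", 5.00),
    "production":  ModelDefaults("gpt-4o", "claude-sonnet-4", "gemini-1.5-pro", 100.00),
}

def load_defaults() -> ModelDefaults:
    # Fall back to development settings when ENVIRONMENT is unset.
    env = os.environ.get("ENVIRONMENT", "development")
    return DEFAULTS[env]
```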
All environments implement a `UnifiedClient` that automatically selects an available provider:
```python
from clients.ai_clients import UnifiedClient

client = UnifiedClient()
response = client.chat("Hello!")
```
```javascript
// Node.js
import { UnifiedClient } from './clients/ai-clients.js';

const client = new UnifiedClient();
const response = await client.chat("Hello!");
```
```cpp
// C++
clients::UnifiedClient client;
auto response = client.chat("Hello!");
```
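The selection and fallback logic itself is not shown above. A minimal sketch of how it could work, assuming providers are tried in a fixed order and any provider error falls through to the next one (class and method names here are illustrative, not the repo's actual implementation):

```python
# Minimal sketch of auto-selection with fallback; the real UnifiedClient in
# clients/ai_clients.py may differ in naming and error handling.
import os

class UnifiedClientSketch:
    # A provider is considered available when its API key is set.
    PROVIDERS = [
        ("openai", "OPENAI_API_KEY"),
        ("anthropic", "ANTHROPIC_API_KEY"),
        ("gemini", "GEMINI_API_KEY"),
    ]

    def chat(self, prompt: str) -> str:
        errors = []
        for name, key_var in self.PROVIDERS:
            if not os.environ.get(key_var):
                continue  # skip providers without configured keys
            try:
                return self._call_provider(name, prompt)
            except Exception as exc:  # fall back to the next provider
                errors.append(f"{name}: {exc}")
        raise RuntimeError("No provider succeeded: " + "; ".join(errors))

    def _call_provider(self, name: str, prompt: str) -> str:
        # Placeholder: the real client dispatches to the provider SDKs here.
        raise NotImplementedError
```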
```python
from config import get_config

config = get_config()
print(config.models.openai_default)
```
```javascript
// Node.js
import { getConfig } from './config/index.js';

const config = getConfig();
console.log(config.models.openaiDefault);
```
```cpp
// C++
auto cfg = config::get_config();
std::cout << cfg.models.openai_default << std::endl;
```
| Feature | Python | Node.js | C++ |
|---|---|---|---|
| OpenAI SDK | ✓ | ✓ | REST |
| Anthropic SDK | ✓ | ✓ | REST |
| Google SDK | ✓ | ✓ | REST |
| LangChain | ✓ | - | - |
| LlamaIndex | ✓ | - | - |
| Vector Store | ✓ (ChromaDB) | - | - |
| Usage Tracking | ✓ | Basic | Basic |
| Key Rotation | ✓ | - | - |
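For the usage-tracking row, the Python environment's `tracking/` module records spend against the daily cost limit from the configuration table above. As a rough, hypothetical sketch of that idea (not the repo's actual classes):

```python
# Hypothetical sketch of daily cost tracking; the actual code in
# python/tracking/ may be named and structured differently.
from datetime import date

class DailyCostTracker:
    def __init__(self, daily_limit_usd: float):
        self.daily_limit_usd = daily_limit_usd
        self._day = date.today()
        self._spent = 0.0

    def record(self, cost_usd: float) -> None:
        # Reset the running total when the day rolls over.
        if date.today() != self._day:
            self._day = date.today()
            self._spent = 0.0
        if self._spent + cost_usd > self.daily_limit_usd:
            raise RuntimeError(
                f"Daily cost limit of ${self.daily_limit_usd:.2f} exceeded"
            )
        self._spent += cost_usd
```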
This workspace is designed as a reference implementation for:
- Learning - Understanding AI API integration patterns
- Prototyping - Quick testing of AI features
- Comparison - Evaluating providers and languages
- Templates - Starting point for new projects
Open Source - Free to use, modify, and distribute.