OpenRouter Express Starter

A production-ready Express.js API server for interacting with OpenRouter's LLM API.

Features

  • ✅ Helmet middleware for security headers
  • ✅ CORS origin restrictions in production
  • ✅ Rate limiting on /api/chat endpoint
  • ✅ Server timeouts configured
  • ✅ Docker containerization support
  • ✅ Environment-based configuration
  • ✅ Health check endpoint

Prerequisites

  • Node.js 20+ (or Docker)
  • OpenRouter API key

Environment Variables

Copy env.example to .env and configure:

cp env.example .env

Required variables:

  • OPENROUTER_API_KEY - Your OpenRouter API key
  • ALLOWED_ORIGINS - Comma-separated list of allowed CORS origins (production only)
  • NODE_ENV - Set to production for production deployment
  • PORT - Server port (default: 3000)
  • OPENROUTER_DEFAULT_MODEL - Default model to use (default: x-ai/grok-4.1-fast)
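
For reference, here is a minimal sketch of how these variables might be read at startup. The variable names come from the list above; everything else (the config.ts module, the use of dotenv, the validation) is an assumption for illustration, not the repo's documented code.

// config.ts - hypothetical helper; assumes dotenv loads .env (the repo may differ).
import "dotenv/config";

export const config = {
	port: Number(process.env.PORT ?? 3000),
	nodeEnv: process.env.NODE_ENV ?? "development",
	openRouterApiKey: process.env.OPENROUTER_API_KEY ?? "",
	defaultModel: process.env.OPENROUTER_DEFAULT_MODEL ?? "x-ai/grok-4.1-fast",
	allowedOrigins: (process.env.ALLOWED_ORIGINS ?? "")
		.split(",")
		.map((origin) => origin.trim())
		.filter(Boolean),
};

// Fail fast if the one strictly required secret is missing.
if (!config.openRouterApiKey) {
	throw new Error("OPENROUTER_API_KEY is required");
}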

Local Development

# Install dependencies
npm install

# Run in development mode
npm run dev

# Build for production
npm run build

# Run production build
npm start

Docker Deployment

Using Docker Compose (Recommended)

# Build and start the container
docker-compose up -d

# View logs
docker-compose logs -f

# Stop the container
docker-compose down

Using Docker directly

# Build the image
docker build -t openrouter-express-starter .

# Run the container
docker run -d \
  --name openrouter-express-starter \
  -p 3000:3000 \
  --env-file .env \
  openrouter-express-starter

Production Deployment on VPS (AWS EC2, DigitalOcean, etc.)

Option 1: Docker Compose

  1. SSH into your VPS

    ssh user@your-server-ip
  2. Install Docker and Docker Compose

    # Ubuntu/Debian
    curl -fsSL https://get.docker.com -o get-docker.sh
    sh get-docker.sh
    sudo usermod -aG docker $USER
    
    # Install Docker Compose
    sudo curl -L "https://github.com/docker/compose/releases/latest/download/docker-compose-$(uname -s)-$(uname -m)" -o /usr/local/bin/docker-compose
    sudo chmod +x /usr/local/bin/docker-compose
  3. Clone your repository

    git clone https://github.com/Shriansh2002/openrouter-express-starter.git
    cd openrouter-express-starter
  4. Create .env file

    cp env.example .env
    nano .env  # Edit with your values
  5. Start the application

    docker-compose up -d
  6. Set up reverse proxy (NGINX)

    server {
        listen 80;
        server_name yourdomain.com;
    
        location / {
            proxy_pass http://localhost:3000;
            proxy_http_version 1.1;
            proxy_set_header Upgrade $http_upgrade;
            proxy_set_header Connection 'upgrade';
            proxy_set_header Host $host;
            proxy_set_header X-Real-IP $remote_addr;
            proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
            proxy_set_header X-Forwarded-Proto $scheme;
            proxy_cache_bypass $http_upgrade;
    
            # Timeouts
            proxy_connect_timeout 30s;
            proxy_send_timeout 30s;
            proxy_read_timeout 30s;
        }
    }

Option 2: Direct Node.js Deployment

  1. Install Node.js 20+

    curl -fsSL https://deb.nodesource.com/setup_20.x | sudo -E bash -
    sudo apt-get install -y nodejs
  2. Clone and build

    git clone https://github.com/Shriansh2002/openrouter-express-starter.git
    cd openrouter-express-starter
    npm ci
    npm run build
  3. Configure environment variables

    cp env.example .env
    nano .env  # Edit with your values
  4. Set up PM2 for process management

    npm install -g pm2
    pm2 start dist/server.js --name openrouter-express-starter
    pm2 save
    pm2 startup  # Follow instructions to enable auto-start

Security Features

  • Helmet: Sets security-related HTTP response headers
  • CORS: Origins restricted to the ALLOWED_ORIGINS list in production mode
  • Rate Limiting: 50 requests per 15 minutes per IP on /api/chat (production)
  • Timeouts: 30s request timeout, 65s keep-alive timeout
  • Non-root user: The Docker container runs as a non-root user
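
The following is a rough TypeScript sketch of how such a middleware stack is commonly wired in Express; the option values mirror the numbers above, but the file layout, variable names, and exact options in this repo may differ.

// server.ts sketch - helmet, CORS restrictions and server timeouts (illustrative only).
import express from "express";
import helmet from "helmet";
import cors from "cors";

const app = express();
const isProd = process.env.NODE_ENV === "production";
const allowedOrigins = (process.env.ALLOWED_ORIGINS ?? "").split(",").filter(Boolean);

app.use(helmet()); // security headers
app.use(isProd ? cors({ origin: allowedOrigins }) : cors()); // restrict origins only in production
app.use(express.json());

const server = app.listen(Number(process.env.PORT ?? 3000));
server.requestTimeout = 30_000;   // 30s request timeout
server.keepAliveTimeout = 65_000; // 65s keep-alive timeout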

API Endpoints

POST /api/chat

Send a chat request to the LLM.

Request Body:

{
	"messages": [
		{
			"role": "user",
			"content": "Hello, how are you?"
		}
	],
	"model": "x-ai/grok-4.1-fast", // optional
	"stream": false // optional, currently not supported
}

Response:

{
	"model": "x-ai/grok-4.1-fast",
	"message": {
		"role": "assistant",
		"content": "I'm doing well, thank you!"
	},
	"usage": {
		"prompt_tokens": 10,
		"completion_tokens": 8,
		"total_tokens": 18
	}
}
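
For a quick test from a script, something like the following works against a locally running server (assumes Node 18+ for the global fetch; the field names follow the request and response shapes shown above):

// chat-client.ts - hypothetical client script, not part of the repo.
async function chat(): Promise<void> {
	const res = await fetch("http://localhost:3000/api/chat", {
		method: "POST",
		headers: { "Content-Type": "application/json" },
		body: JSON.stringify({
			messages: [{ role: "user", content: "Hello, how are you?" }],
		}),
	});

	if (!res.ok) {
		throw new Error(`Request failed with HTTP ${res.status}`);
	}

	const data = await res.json();
	console.log(data.message.content);
}

chat().catch(console.error);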

GET /health

Health check endpoint.

Response:

{
	"status": "ok"
}

Rate Limiting

  • Production: 50 requests per 15 minutes per IP
  • Development: 100 requests per 15 minutes per IP

Rate limit headers are included in responses:

  • RateLimit-Limit: Maximum requests allowed
  • RateLimit-Remaining: Remaining requests in current window
  • RateLimit-Reset: Time when the rate limit resets
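
With the express-rate-limit package, a limiter matching these numbers might look roughly like this (a sketch under the assumption that express-rate-limit is the library in use; option names follow its documented API, but the repo's actual configuration may differ):

// Rate-limit sketch: 15-minute window, 50 requests/IP in production, 100 in development.
import express from "express";
import rateLimit from "express-rate-limit";

const app = express();
const isProd = process.env.NODE_ENV === "production";

const chatLimiter = rateLimit({
	windowMs: 15 * 60 * 1000,
	max: isProd ? 50 : 100,
	standardHeaders: true, // emit RateLimit-Limit / RateLimit-Remaining / RateLimit-Reset
	legacyHeaders: false,  // omit the older X-RateLimit-* headers
});

app.use("/api/chat", chatLimiter);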

Monitoring

The application includes a health check endpoint at /health that can be used with monitoring tools or load balancers.
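
For example, an external probe only needs to confirm that the endpoint returns HTTP 200; a minimal TypeScript sketch (the healthcheck.ts script and HEALTHCHECK_URL variable are hypothetical):

// healthcheck.ts - exits non-zero when /health is unreachable or unhealthy.
const url = process.env.HEALTHCHECK_URL ?? "http://localhost:3000/health";

fetch(url)
	.then((res) => {
		if (!res.ok) throw new Error(`Unhealthy: HTTP ${res.status}`);
		console.log("ok");
	})
	.catch((err) => {
		console.error(err);
		process.exit(1);
	});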

License

MIT
