
Refactor AI Service to Interface-based Provider Pattern #102

@AnkanMisra

Description


Problem

The AI generation logic is tightly coupled to OpenRouter in gateway/main.go. This makes it impossible to:

  1. Switch to OpenAI/Anthropic directly.
  2. Use a local LLM (Ollama) for "Micro" deployments or free development.
  3. Mock the AI provider effectively for unit tests.

Solution

Refactor the AI logic into a dedicated gateway/internal/ai package with a generic Provider interface, so the gateway depends on the abstraction rather than on any single vendor's API.

Architecture

```mermaid
classDiagram
    class Provider {
        <<interface>>
        +Generate(ctx, prompt) string
    }

    class OpenRouterProvider {
        +Generate(ctx, prompt)
    }

    class OllamaProvider {
        +Generate(ctx, prompt)
    }

    class MockProvider {
        +Generate(ctx, prompt)
    }

    Provider <|-- OpenRouterProvider
    Provider <|-- OllamaProvider
    Provider <|-- MockProvider

    class Gateway {
        -provider Provider
        +handleSummarize()
    }

    Gateway --> Provider
```

Implementation

  1. Create gateway/internal/ai/ package.
  2. Implement OpenRouterProvider (move logic from callOpenRouter).
  3. Implement OllamaProvider (connect to http://localhost:11434).
  4. Update main.go to initialize the provider based on AI_PROVIDER env var.

Acceptance Criteria

  • callOpenRouter function is removed from main.go
  • AI_PROVIDER env var selects the backend (default: openrouter)
  • Ollama Support: Can connect to a local Ollama instance
  • Existing OpenRouter functionality remains unchanged
  • Unit tests for the Provider factory and implementations

Testing

```bash
# Test OpenRouter
AI_PROVIDER=openrouter go run main.go

# Test Ollama
AI_PROVIDER=ollama OLLAMA_URL=http://localhost:11434 go run main.go
```
