## Problem
The AI generation logic is tightly coupled to OpenRouter in `gateway/main.go`. This makes it impossible to:
- Switch to OpenAI/Anthropic directly.
- Use a local LLM (Ollama) for "Micro" deployments or free development.
- Mock the AI provider effectively for unit tests.
## Solution
Refactor the AI logic into a dedicated `services/ai` package with a generic `LLMProvider` interface.
## Architecture
```mermaid
classDiagram
    class Provider {
        <<interface>>
        +Generate(ctx, prompt) string
    }
    class OpenRouterProvider {
        +Generate(ctx, prompt)
    }
    class OllamaProvider {
        +Generate(ctx, prompt)
    }
    class MockProvider {
        +Generate(ctx, prompt)
    }
    Provider <|-- OpenRouterProvider
    Provider <|-- OllamaProvider
    Provider <|-- MockProvider
    class Gateway {
        -provider Provider
        +handleSummarize()
    }
    Gateway --> Provider
```
## Implementation
- Create `gateway/internal/ai/` package.
- Implement `OpenRouterProvider` (move logic from `callOpenRouter`).
- Implement `OllamaProvider` (connect to `http://localhost:11434`).
- Update `main.go` to initialize the provider based on `AI_PROVIDER` env var.
## Acceptance Criteria
- `callOpenRouter` function is removed from `main.go`
- `AI_PROVIDER` env var selects the backend (default: `openrouter`)
- Ollama Support: can connect to a local Ollama instance
- Existing OpenRouter functionality remains unchanged
- Unit tests for the Provider factory and implementations
Testing
# Test OpenRouter
AI_PROVIDER=openrouter go run main.go
# Test Ollama
AI_PROVIDER=ollama OLLAMA_URL=http://localhost:11434 go run main.goReactions are currently unavailable