Labels: enhancement (New feature or request), good first issue (Good for newcomers)

Description
Create backend API routes for authenticated users to:
- Create, read, and update their AI model settings.
- Support multiple AI providers and models.
- Store API keys encrypted for security.
- Mark each model as paid or free.
- Set a system-wide AI assistant prompt.
Details
- Users can save their AI providers, models, and encrypted API keys.
- Users can retrieve their current configuration; encrypted API keys are included in the response only when the client explicitly needs them.
- Users can update their settings anytime.
- API keys are encrypted and never exposed in plain text.
- Only logged-in users can access their own data.
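The configuration described above can be sketched as a simple data model. This is an illustrative sketch only; the class and field names below mirror the JSON bodies in this issue but are otherwise assumptions, not part of the spec:

```python
from dataclasses import dataclass, field
from typing import Optional


@dataclass
class AIModel:
    """One provider/model entry; apiKeyEncrypted is None for keyless providers (e.g. Ollama)."""
    provider: str
    type: str  # e.g. "chat"
    model: str
    apiKeyEncrypted: Optional[str]
    isPaid: bool


@dataclass
class AIModelConfig:
    """Per-user configuration the backend would persist."""
    models: list = field(default_factory=list)
    system_prompt: str = "You are a helpful AI assistant."
```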
🔹 1. Create AI Model Configuration
Create a secure API endpoint that allows a logged-in user to save their AI model configuration.
Endpoint
POST /api/v1/create/ai-model-config

Request Body

```json
{
  "models": [
    {
      "provider": "OpenAI",
      "type": "chat",
      "model": "gpt-4",
      "apiKeyEncrypted": "<encrypted-api-key-openai>",
      "isPaid": true
    },
    {
      "provider": "Anthropic",
      "type": "chat",
      "model": "claude-2",
      "apiKeyEncrypted": "<encrypted-api-key-anthropic>",
      "isPaid": true
    },
    {
      "provider": "Ollama",
      "type": "chat",
      "model": "llama3",
      "apiKeyEncrypted": null,
      "isPaid": false
    }
  ],
  "system_prompt": "You are a helpful AI assistant."
}
```
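A minimal, framework-agnostic sketch of how the POST body could be validated server-side. The function name, error messages, and the rule that paid models must carry an encrypted key are assumptions for illustration, not requirements stated in this issue:

```python
# Fields every model entry must carry; apiKeyEncrypted is allowed to be null
# (e.g. for local Ollama models), so it is not in the required set.
REQUIRED_FIELDS = {"provider", "type", "model", "isPaid"}


def validate_config_payload(payload: dict) -> list:
    """Return a list of validation errors for a create/update body (empty list = valid)."""
    errors = []
    models = payload.get("models")
    if not isinstance(models, list) or not models:
        errors.append("models must be a non-empty list")
        return errors
    for i, m in enumerate(models):
        missing = REQUIRED_FIELDS - set(m)
        if missing:
            errors.append(f"models[{i}] missing fields: {sorted(missing)}")
        # Assumed rule: a paid model needs an encrypted key to bill against.
        if m.get("isPaid") and not m.get("apiKeyEncrypted"):
            errors.append(f"models[{i}]: paid models require apiKeyEncrypted")
    if not isinstance(payload.get("system_prompt"), str):
        errors.append("system_prompt must be a string")
    return errors
```

Returning a list of errors (rather than raising on the first one) lets the route report all problems in a single 400 response.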
🔹 2. Get AI Model Config
Implement a secure API route that allows an authenticated user to retrieve their saved AI model configuration.
Endpoint
GET /api/v1/get/ai-model-config

This configuration includes:
- A list of AI providers and their selected models
- An `isPaid` flag for each model (to manage billing or usage restrictions)
- A customizable `system_prompt` used to influence AI assistant behavior
🧾 Example Response
✅ Standard Response (No Encrypted Keys Returned)
```json
{
  "models": [
    {
      "provider": "OpenAI",
      "type": "chat",
      "model": "gpt-4",
      "isPaid": true
    },
    {
      "provider": "Anthropic",
      "type": "chat",
      "model": "claude-2",
      "isPaid": true
    },
    {
      "provider": "Ollama",
      "type": "chat",
      "model": "llama3",
      "isPaid": false
    }
  ],
  "system_prompt": "You are a helpful AI assistant."
}
```
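Since the standard response must omit `apiKeyEncrypted`, the GET route needs to strip keys from the stored record before responding. A small serializer sketch, assuming the stored config is a plain dict shaped like the POST body (the function name and `include_keys` flag are illustrative):

```python
def serialize_config(config: dict, include_keys: bool = False) -> dict:
    """Build the GET response; by default drop apiKeyEncrypted from every model."""
    models = []
    for m in config["models"]:
        out = dict(m)  # copy so the stored record is never mutated
        if not include_keys:
            out.pop("apiKeyEncrypted", None)
        models.append(out)
    return {"models": models, "system_prompt": config["system_prompt"]}
```

Defaulting to `include_keys=False` makes the safe behavior the one you get by accident; the key-bearing variant must be requested deliberately.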
🔐 Optional: Response Including Encrypted Keys (If needed by client)
```json
{
  "models": [
    {
      "provider": "OpenAI",
      "type": "chat",
      "model": "gpt-4",
      "isPaid": true,
      "apiKeyEncrypted": "<encrypted-string>"
    }
  ],
  "system_prompt": "You are a helpful AI assistant."
}
```

🔹 3. Update AI Model Configuration
Endpoint
PUT /api/v1/update/ai-model-config

Provide an API endpoint that allows a logged-in user to update their existing AI model configuration. This route enables users to modify:
- AI provider or model details
- API keys (must be encrypted)
- Paid/free flags
- The system-wide AI assistant prompt
This ensures the configuration remains flexible, secure, and personalized to the user’s preferences.
🧾 Request Body
The structure is the same as for the POST route:

```json
{
  "models": [
    {
      "provider": "OpenAI",
      "type": "chat",
      "model": "gpt-4",
      "apiKeyEncrypted": "<new-or-existing-encrypted-key>",
      "isPaid": true
    },
    {
      "provider": "Ollama",
      "type": "chat",
      "model": "llama3",
      "apiKeyEncrypted": null,
      "isPaid": false
    }
  ],
  "system_prompt": "You are a helpful AI assistant."
}
```