BACKEND: AI Agent Proxy Layer #2

@tosoham

Description

Build the interface for LLM-powered agents:

  • Integrate Ollama (local Llama 2, Mistral, etc.) via its HTTP/CLI API

  • Expose POST /agent/response that accepts { prompt, roomId } and returns AI text

  • Add retry/fallback logic if Ollama is unavailable
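A minimal sketch of the three bullets above, assuming Ollama's non-streaming `POST /api/generate` endpoint on its default `localhost:11434` address. The model name, fallback text, and retry counts are placeholders, and the `generate` parameter is injectable so the retry/fallback path can be exercised without a running Ollama instance:

```python
import json
import time
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint
FALLBACK_TEXT = "The AI agent is temporarily unavailable. Please try again later."


def call_ollama(prompt, model="mistral", url=OLLAMA_URL, timeout=30):
    """Send one non-streaming generate request to Ollama and return its text."""
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    req = urllib.request.Request(
        url, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        return json.loads(resp.read())["response"]


def agent_response(prompt, room_id, generate=call_ollama, retries=3, backoff=0.5):
    """Shape of the /agent/response handler: retry with exponential backoff,
    then fall back to a canned reply if Ollama stays unreachable."""
    for attempt in range(retries):
        try:
            return {"roomId": room_id, "text": generate(prompt), "fallback": False}
        except Exception:
            if attempt < retries - 1:
                time.sleep(backoff * 2**attempt)  # 0.5s, 1s, 2s, ...
    return {"roomId": room_id, "text": FALLBACK_TEXT, "fallback": True}
```

The HTTP route itself would just parse `{ prompt, roomId }` from the request body and return `agent_response(prompt, room_id)` as JSON; keeping the retry logic in a plain function makes it framework-agnostic and unit-testable.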

Deliverable: a reliable API to “talk” to your AI models.

See:

Ollama docs: https://ollama.com/docs

llama.cpp overview: https://github.com/ggerganov/llama.cpp

Metadata

    Labels

    M (Medium size issue) · good first issue (Good for newcomers) · onlydust-wave (Contribute to awesome OSS repos during OnlyDust's open source week)
