
Make endpoint configurable for use with Ollama #1

@moppman

Description


Hi Jeff,

Your page-summarizer extension looks very cool and handy, so I was wondering if you'd consider making the API endpoint configurable.

Background:
Ollama is a project that enables users to run LLMs locally on their own computers.
A couple of days ago, v0.1.24 gained compatibility with the OpenAI Chat Completions API, which (in theory) means that projects like page-summarizer only need to update their config (mainly the API endpoint and model name) and everything should "just work" ™️
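To illustrate the point (a hedged sketch, not page-summarizer's actual code: the language, helper name, and model name are my assumptions; Ollama's OpenAI-compatible API is served under `/v1` on its default port 11434):

```python
import json

# Assumption: Ollama's default local address; its OpenAI-compatible
# API lives under the /v1 prefix as of v0.1.24.
OLLAMA_BASE_URL = "http://localhost:11434/v1"


def build_chat_request(base_url, model, prompt):
    """Build an OpenAI-style Chat Completions request (URL + JSON body).

    Only base_url and model differ between OpenAI and Ollama; the
    request shape is identical, which is the whole point of the ask.
    """
    url = f"{base_url}/chat/completions"
    body = {
        "model": model,  # e.g. a locally pulled Ollama model (hypothetical name below)
        "messages": [{"role": "user", "content": prompt}],
    }
    return url, json.dumps(body)


url, body = build_chat_request(OLLAMA_BASE_URL, "llama2", "Summarize this page.")
print(url)
```

So if the endpoint (and ideally the model name) were settings rather than constants, pointing the extension at a local Ollama instance should require no other code changes.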

Thanks, and take care!

Metadata

Assignees: none
Labels: enhancement (New feature or request)
Projects: none
Milestone: none
Relationships: none yet
Development: no branches or pull requests
