Releases: Stream29/ProxyAsLocalModel
v0.0.8
v0.1.0-alpha
v0.0.7
v0.0.6
v0.0.5
v0.0.4
v0.0.3
v0.0.2
Full Changelog: v0.0.1...v0.0.2
feat: add KtorClientConfig for customizable HTTP client settings
v0.0.1
Proxies a remote LLM API as a local model. Especially useful for running a custom LLM in JetBrains AI Assistant.
Currently supports:
Proxy from: OpenAI, Dashscope (Alibaba Qwen), Gemini, Deepseek, Mistral, SiliconFlow.
Proxy as: LM Studio, Ollama.
Only the streaming chat completion API is supported.
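Since the proxy presents itself as LM Studio or Ollama, clients talk to it with the OpenAI-style streaming chat completion wire format. Below is a minimal sketch of that interaction: the shape of a streaming request body and a parser for one server-sent-events chunk of the response. The endpoint URL and the model name are assumptions for illustration, not values defined by this project.

```python
import json

# Hypothetical proxy address: LM Studio's OpenAI-compatible endpoint
# conventionally listens on port 1234; the real port depends on config.
PROXY_URL = "http://localhost:1234/v1/chat/completions"

# A streaming chat completion request body (OpenAI wire format).
# "qwen-max" is a placeholder model name; substitute whatever model
# the proxy exposes from its upstream provider.
request_body = {
    "model": "qwen-max",
    "stream": True,  # the proxy supports streaming completions only
    "messages": [
        {"role": "user", "content": "Hello!"},
    ],
}

def parse_sse_chunk(line: str):
    """Parse one server-sent-events line of a streaming response.

    Returns the delta text, or None for non-data lines and the
    final '[DONE]' sentinel.
    """
    if not line.startswith("data: "):
        return None
    payload = line[len("data: "):]
    if payload.strip() == "[DONE]":
        return None
    chunk = json.loads(payload)
    return chunk["choices"][0]["delta"].get("content")

# Example chunk as an OpenAI-compatible server would emit it:
sample = 'data: {"choices": [{"delta": {"content": "Hi"}}]}'
print(parse_sse_chunk(sample))  # prints "Hi"
```

Concatenating the `content` deltas from successive chunks reconstructs the full reply; the stream ends with a `data: [DONE]` line.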