Releases: Stream29/ProxyAsLocalModel

v0.0.8

15 Nov 08:51
1f4bd31

What's Changed

New Contributors

Full Changelog: v0.0.7...v0.0.8

v0.1.0-alpha

26 Jul 15:42
95a50fe

Pre-release

What's Changed

Full Changelog: v0.0.7...v0.1.0-alpha

v0.0.7

07 May 01:28
db51926

Full Changelog: v0.0.6...v0.0.7

Fix issue #6

v0.0.6

06 May 10:16
f4d9537

Full Changelog: v0.0.5...v0.0.6

Fix issue #6 by merging messages.

Better error handling: when the Ktor client throws an exception, the request body now appears in the log.

v0.0.5

03 May 20:52
e71ad4e

What's Changed

  • feat(server): add OpenRouter support by @Stream29 in #5

Full Changelog: v0.0.4...v0.0.5

v0.0.4

03 May 15:54
933dc81

What's Changed

  • feat: configure custom path in config.yml by @Stream29 in #3

Full Changelog: v0.0.3...v0.0.4
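The custom path feature from #3 presumably lets you override the endpoint path the proxy serves in config.yml. A sketch of what such an entry might look like (the key names below are illustrative assumptions, not confirmed by this changelog):

```yaml
# Hypothetical config.yml fragment; actual key names may differ.
lmStudio:
  port: 1234
  path: /my/custom/path   # custom endpoint path instead of the default
```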

v0.0.3

03 May 11:18
e4ddf24

Full Changelog: v0.0.2...v0.0.3

Correct the baseUrl of DeepSeek.

Add Claude support.

v0.0.2

03 May 09:22
47aecf8

Full Changelog: v0.0.1...v0.0.2

feat: add KtorClientConfig for customizable HTTP client settings
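The KtorClientConfig mentioned above suggests HTTP client settings (such as timeouts) became tunable from config.yml. A hypothetical fragment, with key names that are illustrative assumptions only:

```yaml
# Hypothetical config.yml fragment; actual key names may differ.
client:
  requestTimeoutMillis: 60000
  connectTimeoutMillis: 10000
```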

v0.0.1

02 May 01:07
5737f14

Proxy a remote LLM API as a local model. Especially useful for running a custom LLM in JetBrains AI Assistant.

Currently supports:

Proxy from: OpenAI, Dashscope (Alibaba Qwen), Gemini, DeepSeek, Mistral, SiliconFlow.

Proxy as: LM Studio, Ollama.

Streaming chat completion API only.
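Since only the streaming chat completion API is supported, a client must send `"stream": true` and read the response as Server-Sent Events. A minimal Python sketch of the request shape and chunk parsing (the endpoint URL and model name are assumptions for illustration, not taken from this changelog):

```python
import json

# Hypothetical endpoint: ProxyAsLocalModel serving an LM Studio-style
# (OpenAI-compatible) API locally; URL and model name are assumptions.
url = "http://localhost:1234/v1/chat/completions"

# Streaming chat completion request body; "stream": True is required,
# since the proxy supports the streaming API only.
request_body = {
    "model": "qwen-max",
    "messages": [{"role": "user", "content": "Hello!"}],
    "stream": True,
}

def parse_sse_chunk(line: str) -> str:
    """Extract the incremental delta text from one SSE data line."""
    payload = line.removeprefix("data: ")
    if payload.strip() == "[DONE]":
        return ""  # end-of-stream sentinel carries no text
    chunk = json.loads(payload)
    return chunk["choices"][0]["delta"].get("content", "")

# Each SSE data line carries one JSON chunk with an incremental "delta".
sample_sse_line = 'data: {"choices":[{"delta":{"content":"Hi"}}]}'
print(parse_sse_chunk(sample_sse_line))  # -> Hi
```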