@korenmiklos (Member)

BREAKING CHANGE: This is a major version update (0.4.3 -> 0.5.0)

Changes:

  • Replace openai dependency with litellm for multi-provider support
  • Add support for OpenAI, OpenRouter, Anthropic, and 100+ other providers
  • Add api_base parameter for custom API endpoints
  • Add **litellm_kwargs for provider-specific configuration
  • Make api_key optional (can use environment variables)
  • Update _clean_batch to use litellm.completion() with JSON mode (see the sketch after this list)
  • Update tests to mock litellm instead of openai
  • Update documentation (README.md and CLAUDE.md) with provider examples
  • Update example.py with multi-provider usage examples
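
A rough sketch of what the updated `_clean_batch` call could look like. Only `litellm.completion()`, JSON mode, the optional `api_key`/`api_base` parameters, and the `**litellm_kwargs` pass-through come from the change list above; the function signature, prompt handling, and return value here are illustrative assumptions, not the actual implementation.

```python
# Hypothetical sketch of a _clean_batch-style call; not the package's real code.
import json
import litellm

def _clean_batch(prompt, model="gpt-4o-mini", api_key=None, api_base=None, **litellm_kwargs):
    """Send one batch prompt and parse the JSON response (illustrative only)."""
    response = litellm.completion(
        model=model,
        messages=[{"role": "user", "content": prompt}],
        response_format={"type": "json_object"},  # JSON mode
        api_key=api_key,        # optional: falls back to provider env vars
        api_base=api_base,      # optional: custom API endpoint
        **litellm_kwargs,       # provider-specific configuration
    )
    return json.loads(response.choices[0].message.content)
```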

Benefits:

  • Users can now choose from 100+ LLM providers
  • Better support for OpenRouter (access multiple models through one API)
  • More flexible API key management (environment variables or direct)
  • Consistent interface across all providers

Migration guide:

  • Existing OpenAI usage continues to work with minimal changes
  • API key can now be set via OPENAI_API_KEY environment variable
  • For other providers, use model prefixes such as "openrouter/model-name" (see the example below)
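
A minimal usage sketch of these migration patterns, calling litellm directly. The model names and key values are placeholders chosen for illustration, not values taken from this PR.

```python
# Sketch only: environment-variable keys and provider-prefixed model names.
import os
import litellm

# OpenAI: api_key is picked up from OPENAI_API_KEY if not passed explicitly.
os.environ["OPENAI_API_KEY"] = "sk-..."  # placeholder
resp = litellm.completion(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Say hello"}],
)

# OpenRouter: prefix the model name and set OPENROUTER_API_KEY.
os.environ["OPENROUTER_API_KEY"] = "sk-or-..."  # placeholder
resp = litellm.completion(
    model="openrouter/anthropic/claude-3.5-sonnet",
    messages=[{"role": "user", "content": "Say hello"}],
)
print(resp.choices[0].message.content)
```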
