Conversation

@korenmiklos (Member)

- Replace direct OpenAI dependency with llm library for multi-provider support
- Update DataCleaner to use llm.get_model() and structured output via the schema parameter (see the sketch below)
- Change initialization from api_key to model parameter (default: gpt-4o-mini)
- Refactor batch processing to combine system/user messages for LLM API format
- Update tests to mock llm.get_model instead of OpenAI client
- Fix Pydantic deprecation warning using __class__.model_fields
- Add comprehensive setup documentation for LLM API configuration
- Bump version to 0.5.0 to reflect breaking API changes

Closes #17

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
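
A minimal sketch of the new call path described above, assuming a hypothetical `CleanedRecord` output schema and `clean()` method; only `llm.get_model()`, `model.prompt(..., schema=...)`, and `response.text()` are the llm library's own API:

```python
# Minimal sketch of the new DataCleaner call path. The CleanedRecord schema,
# method names, and prompt layout are illustrative assumptions; only
# llm.get_model(), model.prompt(..., schema=...), and response.text()
# come from the llm library itself.
import json

import llm
from pydantic import BaseModel


class CleanedRecord(BaseModel):
    # Hypothetical structured-output schema for one cleaned record.
    name: str
    value: float


class DataCleaner:
    def __init__(self, model: str = "gpt-4o-mini"):
        # Breaking change: the constructor takes a model name instead of an
        # api_key; credentials are resolved by the llm library/CLI.
        self.model = llm.get_model(model)

    def clean(self, system_prompt: str, user_prompt: str) -> CleanedRecord:
        # System and user messages are combined into a single prompt, and
        # structured output is requested via the schema parameter.
        response = self.model.prompt(
            f"{system_prompt}\n\n{user_prompt}",
            schema=CleanedRecord,
        )
        data = json.loads(response.text())
        # Accessing model_fields on the class (not the instance) avoids the
        # Pydantic deprecation warning mentioned above.
        assert set(data) <= set(CleanedRecord.model_fields)
        return CleanedRecord(**data)
```

With this setup, provider credentials are configured through the llm CLI rather than passed as an api_key (e.g. `llm keys set openai`), and additional backends can be enabled via plugins (e.g. `llm install llm-gemini`).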


Development

Successfully merging this pull request may close these issues:

Explore other backends (#17)
