# ask_llm

A modern command-line tool for calling multiple LLM APIs (DeepSeek, Qwen, etc.) with an elegant interface.

## Features
- ✨ Modern CLI - Built with Typer and Rich
- 🔧 Type Safe - Full type hints and Pydantic validation
- 📊 Progress Bars - Visual feedback for file operations
- 📝 Rich Logging - Powered by Loguru
- 💬 Interactive Chat - Multi-turn conversations with command support
- 🔌 Multiple Providers - Support for OpenAI-compatible APIs
- 📦 Batch Processing - Process multiple tasks concurrently with multi-threading
## Installation

```bash
# Clone repository
git clone <repository-url>
cd ask_llm

# Install dependencies
pip install -r requirements.txt

# Install in development mode
pip install -e .
```

## Configuration

```bash
# Create example configuration
ask-llm config init

# Edit config.json with your API keys
# Then verify
ask-llm config test
```

## Usage

```bash
# Process a file
ask-llm ask input.txt

# Direct text input
ask-llm "Translate to Chinese: Hello world"

# Interactive chat mode
ask-llm chat

# With initial context
ask-llm chat -i context.txt -s "You are a helpful assistant"

# Batch processing
ask-llm batch batch-examples/prompt-contents.yml -o results.json
```

## Commands

| Command | Description |
|---|---|
| `ask-llm ask [INPUT]` | Process input with LLM |
| `ask-llm chat` | Start interactive chat |
| `ask-llm batch [CONFIG]` | Process batch tasks from YAML config |
| `ask-llm config show` | Display configuration |
| `ask-llm config test` | Test API connections |
| `ask-llm config init` | Create example config |
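The `config init` command generates a starter `config.json` to fill in with your API keys. The tool defines the actual schema; the sketch below is purely illustrative (all field names are assumptions, not the tool's real format), showing the kind of information an OpenAI-compatible provider setup needs:

```json
{
  "default_provider": "deepseek",
  "providers": {
    "deepseek": {
      "api_key": "YOUR_API_KEY",
      "base_url": "https://api.deepseek.com/v1",
      "model": "deepseek-chat"
    },
    "qwen": {
      "api_key": "YOUR_API_KEY",
      "base_url": "https://dashscope.aliyuncs.com/compatible-mode/v1",
      "model": "qwen-plus"
    }
  }
}
```

Run `ask-llm config show` after `config init` to see the actual fields the tool expects.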
## Batch Processing

The `batch` command supports processing multiple tasks concurrently:

```bash
# Basic usage
ask-llm batch batch-examples/prompt-contents.yml

# With options
ask-llm batch config.yml -o results.json -f json --threads 10 --retries 5
```

See docs/BATCH_USAGE.md for detailed batch processing documentation.
## Project Structure

```
ask_llm/
├── src/ask_llm/       # Main package
│   ├── cli.py         # CLI entry point
│   ├── core/          # Core logic
│   ├── providers/     # API providers
│   ├── config/        # Configuration
│   └── utils/         # Utilities
├── tests/             # Tests
├── docs/              # Documentation
└── config.json        # Configuration file
```
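The concurrent batch behavior behind `--threads` and `--retries` follows a standard fan-out-with-retry pattern for I/O-bound API calls. The following is a generic sketch of that pattern using `concurrent.futures` — not the actual implementation in `src/ask_llm/core/`:

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

def run_with_retries(task, call, retries=5):
    """Call the API once per attempt, re-raising the last error if all fail."""
    last_error = None
    for _ in range(retries):
        try:
            return call(task)
        except Exception as exc:  # real code would catch specific API errors
            last_error = exc
    raise last_error

def run_batch(tasks, call, threads=10, retries=5):
    """Fan tasks out across a thread pool and collect results keyed by task."""
    results = {}
    with ThreadPoolExecutor(max_workers=threads) as pool:
        futures = {
            pool.submit(run_with_retries, t, call, retries): t for t in tasks
        }
        for fut in as_completed(futures):
            results[futures[fut]] = fut.result()
    return results
```

Because each worker spends most of its time blocked on an HTTP response, a thread pool is a natural fit for this workload even under the GIL.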
## Development

```bash
# Run tests
pytest

# Run with coverage
pytest --cov=src/ask_llm

# Type checking
mypy src/ask_llm

# Linting
ruff check src/ask_llm
ruff format src/ask_llm
```

See docs/README_ask_llm.md for detailed documentation.
## License

MIT License