A fully free, open-source AI council system built with LangChain and Ollama. This CLI-based multi-agent system uses four specialized AI agents to analyze problems through parallel responses, natural group-chat discussions, and weighted synthesis: completely free, with no API costs, running entirely on your local machine.
- ✅ No API costs - Runs entirely on your local machine with Ollama
- ✅ No cloud dependencies - All processing happens locally
- ✅ Open source - MIT licensed, fully customizable
- ✅ Privacy-first - Your data never leaves your machine
- ✅ Powered by LangChain - Industry-standard AI framework
- ✅ Powered by Ollama - Free, local LLM inference
- 🆓 100% Free - No API costs, runs entirely locally with Ollama
- 🎨 Beautiful CLI with rich formatting and colors
- 💬 Natural group chat conversations between agents
- 🏷️ Tag agents with @mentions for focused debates
- ⚖️ Weighted decision model for final recommendations
- 🔍 Comprehensive synthesis analysis
- 🚀 Parallel agent processing for speed
- 📊 Structured output with agreements, conflicts, and blind spots
- 🔒 Privacy-first - All data stays on your machine
- 🛠️ Built with LangChain - Industry-standard AI framework
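The "weighted decision model" feature can be pictured as a weighted average over per-agent verdicts. A minimal sketch, assuming hypothetical weights and a 0-1 confidence scale (the project's real logic lives in `synthesis.py`; none of these numbers come from the source):

```python
from typing import Dict

# Hypothetical per-agent weights; the project's actual values are internal to synthesis.py.
WEIGHTS = {"Elon": 0.30, "Sam": 0.30, "Sheryl": 0.20, "Ray": 0.20}

def weighted_recommendation(scores: Dict[str, float]) -> float:
    """Combine per-agent confidence scores (0-1) into a single weighted score."""
    total = sum(WEIGHTS[name] for name in scores)
    return sum(WEIGHTS[name] * s for name, s in scores.items()) / total

# Example: three agents lean yes, the risk analyst leans no.
score = weighted_recommendation({"Elon": 0.9, "Sam": 0.8, "Sheryl": 0.7, "Ray": 0.3})  # ≈ 0.71
```

Normalizing by the sum of weights means the same function still works if only a subset of agents responds in a given round.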
The system consists of 4 specialized agents, each with a distinct persona and thinking layer:
- Elon (Visionary) - First-principles thinking, innovation, bold direction
- Sam (Strategist) - Business model, market realities, scalability
- Sheryl (Operator) - Practical execution, system design, reliability
- Ray (Risk Analyst) - Red-team, failure modes, blind spots
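The council queries all four agents in parallel for speed. A simplified sketch of that fan-out using a thread pool; `ask_agent` here is a stand-in for the real per-agent call (which would prompt an Ollama model), not the project's actual API:

```python
from concurrent.futures import ThreadPoolExecutor

AGENTS = ["Elon", "Sam", "Sheryl", "Ray"]

def ask_agent(name: str, problem: str) -> str:
    """Stand-in for the real agent call, which would invoke a local LLM."""
    return f"{name}'s take on: {problem}"

def gather_responses(problem: str, max_workers: int = 4) -> dict:
    """Fan the problem out to every agent concurrently and collect replies by name."""
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        futures = {name: pool.submit(ask_agent, name, problem) for name in AGENTS}
        return {name: fut.result() for name, fut in futures.items()}
```

Threads (rather than processes) are enough here because each agent call is I/O-bound, waiting on the LLM server.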
## Prerequisites

- Python 3.8+ installed
- Ollama (free, local LLM runtime) - download from ollama.ai

```bash
# Install Ollama, then start it:
ollama serve
```

Pull the required models (all free):

```bash
# These models run entirely locally - no API costs!
ollama pull gpt-oss:120b-cloud
ollama pull glm-4.6:cloud
ollama pull kimi-k2-thinking:cloud
ollama pull deepseek-v3.1:671b-cloud
```

Note: Model downloads are free. They run locally on your machine.
## Installation

```bash
# Clone the repository
git clone https://github.com/Hrishikeshgupta2002/AI-Council.git
cd AI-Council

# Install dependencies
pip install -r requirements.txt

# Or install as a package
pip install -e .
```

## Usage

```bash
python main.py
```

Enter your problem statement when prompted, and the council will analyze it.
You can also pass the problem statement directly:

```bash
# Direct input
python main.py "Your problem statement here"

# Piped input
echo "Your problem statement" | python main.py
```

### Interactive Commands

- Normal message: type your message, and all agents respond
- Tag agents: use `@Elon @Sam debate this topic` to start a focused debate
- Continue conversation: press Enter to continue the general conversation
- Exit: type `exit`, `quit`, or `q` to end
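The `@mention` routing above could be implemented by scanning the message for known agent names. A hedged sketch (the project's actual parsing is inside its own code; this regex approach is purely illustrative):

```python
import re

AGENT_NAMES = {"Elon", "Sam", "Sheryl", "Ray"}

def tagged_agents(message: str) -> list:
    """Return agent names @-mentioned in a message, in order, without duplicates."""
    found = []
    for name in re.findall(r"@(\w+)", message):
        if name in AGENT_NAMES and name not in found:
            found.append(name)
    return found
```

An empty result would mean the message is a normal one, so all agents respond.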
### Example Session

```
$ python main.py
Enter your problem statement: Should we build this feature?
# Agents respond...

> @Elon @Sam debate the technical approach
# Tagged agents have a focused 2-3 exchange debate

> # Press Enter for another round
# Agents can choose to respond or skip
```

## Configuration

Set environment variables to customize behavior:
```bash
export OLLAMA_BASE_URL="http://localhost:11434"  # Ollama server URL
export USE_WEIGHTED_MODEL="true"                 # Use weighted model (default: true)
export AGENT_TIMEOUT="60"                        # Agent response timeout in seconds
export MAX_WORKERS="4"                           # Max parallel workers
export DEBUG="true"                              # Show detailed error traces
```

## Project Structure

```
agentic-council/
├── agents/              # Agent implementations
│   ├── __init__.py
│   ├── visionary.py
│   ├── strategist.py
│   ├── operator.py
│   └── risk_analyst.py
├── council.py           # Council orchestrator
├── synthesis.py         # Synthesis agent
├── state.py             # State management
├── config.py            # Configuration module
├── main.py              # CLI entry point
├── requirements.txt     # Python dependencies
├── setup.py             # Package setup
├── pyproject.toml       # Modern Python packaging
├── LICENSE              # MIT License
└── README.md            # This file
```
- `council.py`: main orchestrator managing agent interactions
- `agents/`: individual agent implementations with distinct personas
- `synthesis.py`: meta-agent that synthesizes all responses
- `state.py`: state management for conversation tracking
- `config.py`: centralized configuration management
- `main.py`: CLI interface with rich formatting
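As a rough sketch of how `config.py` might read the environment variables listed in the Configuration section (the `Config` dataclass, field names, and defaults here are illustrative assumptions, not the project's actual code):

```python
import os
from dataclasses import dataclass

@dataclass
class Config:
    """Illustrative settings object; field names are hypothetical."""
    ollama_base_url: str
    use_weighted_model: bool
    agent_timeout: int
    max_workers: int
    debug: bool

def load_config() -> Config:
    """Read the documented environment variables, falling back to defaults."""
    return Config(
        ollama_base_url=os.getenv("OLLAMA_BASE_URL", "http://localhost:11434"),
        use_weighted_model=os.getenv("USE_WEIGHTED_MODEL", "true").lower() == "true",
        agent_timeout=int(os.getenv("AGENT_TIMEOUT", "60")),
        max_workers=int(os.getenv("MAX_WORKERS", "4")),
        debug=os.getenv("DEBUG", "false").lower() == "true",
    )
```

Centralizing the lookups this way means every module sees the same defaults and the CLI never reads `os.environ` directly.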
Contributions are welcome! Please feel free to submit a Pull Request.
This project is licensed under the MIT License - see the LICENSE file for details.
This project is completely free because:
- Ollama provides free, local LLM inference - no API costs
- LangChain is an open-source framework - no licensing fees
- All processing is local - no cloud services or subscriptions needed
- Open source - MIT licensed, you own the code
Unlike commercial AI services that charge per API call, this system runs entirely on your hardware using free, open-source models.
