A robust Python code analysis tool powered by AI that provides intelligent code explanations, quizzes, and structural analysis with automatic fallback between multiple AI providers.
- Multi-Provider AI Support: OpenAI, Anthropic Claude, and Ollama with automatic fallback
- Safe JSON Parsing: Robust error handling and timeout management
- Interactive Web Interface: Built with Streamlit for easy code analysis
- Code Structure Analysis: Automatic detection of functions, classes, imports, and lines
- Study Modes:
  - Explanation only
  - Explanation with quiz
  - Quiz only
- Flexible Input: Paste code directly or upload Python files
- Configurable Settings: Adjustable temperature, detail levels, and custom instructions
- Health Checks: Automatic provider availability detection
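The "Safe JSON Parsing" feature matters because model responses often wrap JSON in prose or code fences. A minimal sketch of such a parser (`safe_json_loads` is an illustrative name, not the tool's actual API):

```python
import json
import re

def safe_json_loads(text: str, default=None):
    """Parse JSON from an AI response, tolerating surrounding prose."""
    # Try the raw text first.
    try:
        return json.loads(text)
    except json.JSONDecodeError:
        pass
    # Fall back to the first {...} span embedded in the response.
    match = re.search(r"\{.*\}", text, re.DOTALL)
    if match:
        try:
            return json.loads(match.group(0))
        except json.JSONDecodeError:
            pass
    return default

print(safe_json_loads('Here is the quiz: {"question": "What does len() do?"}'))
```

Returning a `default` instead of raising keeps the UI responsive even when a provider sends malformed output.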
```
ai-code-decoder/
├── app.py                      # Main Streamlit application
├── requirements.txt            # Python dependencies
├── README.md                   # This file
├── config/
│   └── settings_example.toml   # Configuration template
└── core/
    ├── __init__.py             # Package initialization
    ├── ai_providers.py         # AI provider integrations
    └── analysis_utils.py       # Code analysis utilities
```
- Python 3.8 or higher
- pip package manager
```
git clone https://github.com/vinupalackal/ai-code-decoder.git
cd ai-code-decoder
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate
pip install -r requirements.txt
```

You can provide API keys either through the web interface or environment variables:

Environment Variables:

```
export OPENAI_API_KEY="your_openai_api_key_here"
export ANTHROPIC_API_KEY="your_anthropic_api_key_here"
```

Configuration File (Optional):

```
cp config/settings_example.toml config/settings.toml
# Edit settings.toml with your preferences
```

Start the app with:

```
streamlit run app.py
```

The application will start and be available at http://localhost:8501
- Select "openai" from the Provider dropdown
- Enter your API key in the sidebar (or set the `OPENAI_API_KEY` environment variable)
- Choose a model (default: gpt-4o-mini)
- Install and start Ollama: https://ollama.ai
- Pull a model: `ollama pull llama3.1`
- Select "ollama" from the Provider dropdown
- Ensure the Ollama URL is correct (default: http://localhost:11434)
- Select "anthropic" from the Provider dropdown
- Enter your Anthropic API key
- Choose a model (default: claude-3-5-sonnet-latest)
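Whichever provider you configure first, the automatic fallback tries the rest in order when a call fails. A minimal sketch of that loop (the provider callables here are hypothetical stand-ins for real client calls):

```python
def run_with_fallback(prompt, providers):
    """Try each (name, call) pair in order; return the first success."""
    errors = {}
    for name, call in providers:
        try:
            return name, call(prompt)
        except Exception as exc:  # auth errors, timeouts, connection failures
            errors[name] = str(exc)
    raise RuntimeError(f"All providers failed: {errors}")

def broken_openai(prompt):
    # Stands in for a client call that fails (e.g. missing API key).
    raise ConnectionError("no API key configured")

def local_ollama(prompt):
    # Stands in for a successful local model call.
    return f"local answer to: {prompt}"

name, answer = run_with_fallback(
    "Explain this code",
    [("openai", broken_openai), ("ollama", local_ollama)],
)
print(name)  # ollama
```

Collecting per-provider errors before giving up makes the final failure message diagnosable instead of silent.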
- Input Code: Paste Python code or upload a .py file
- Configure Settings:
  - Choose AI provider and model
  - Set detail level (low/medium/high)
  - Select study mode
  - Adjust temperature for creativity
- Review Code Stats: View automatic structural analysis
- Generate Analysis: Click "Run" to get AI-powered explanation
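The structural stats shown before analysis can be computed with Python's standard `ast` module; a minimal sketch, not the tool's actual `analysis_utils` implementation:

```python
import ast

def code_stats(source: str) -> dict:
    """Count functions, classes, imports, and lines in Python source."""
    tree = ast.parse(source)
    nodes = list(ast.walk(tree))
    return {
        "functions": sum(isinstance(n, (ast.FunctionDef, ast.AsyncFunctionDef)) for n in nodes),
        "classes": sum(isinstance(n, ast.ClassDef) for n in nodes),
        "imports": sum(isinstance(n, (ast.Import, ast.ImportFrom)) for n in nodes),
        "lines": len(source.splitlines()),
    }

sample = "import os\n\nclass A:\n    def f(self):\n        return os.getcwd()\n"
print(code_stats(sample))  # {'functions': 1, 'classes': 1, 'imports': 1, 'lines': 5}
```

Because `ast.parse` requires syntactically valid code, a real implementation would catch `SyntaxError` and fall back to line counts only.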
- Low: Brief, high-level overview
- Medium: Balanced explanation with key concepts
- High: Comprehensive, detailed analysis
- Explanation Only: Pure code explanation
- Explanation + Quiz: Explanation followed by comprehension questions
- Quiz Only: Direct quiz generation for testing understanding
- 0.0-0.3: More focused, deterministic responses
- 0.4-0.7: Balanced creativity and consistency
- 0.8-1.0: More creative, varied responses
```
# Install development dependencies
pip install streamlit

# Run with auto-reload
streamlit run app.py --server.runOnSave true
```

- `app.py`: Main Streamlit interface and user interaction logic
- `core/ai_providers.py`: AI provider implementations with fallback logic
- `core/analysis_utils.py`: Code parsing and structural analysis
- `config/settings_example.toml`: Default configuration template
- Ollama Connection Failed
  - Ensure Ollama is running: `ollama serve`
  - Check that the URL is correct (default: http://localhost:11434)
  - Verify you have models installed: `ollama list`
- OpenAI API Errors
  - Verify your API key is valid
  - Check your account usage limits
  - Ensure the model name is correct
- File Upload Issues
  - Only Python (.py) files are supported
  - Files must be UTF-8 encoded
  - Check file size limits
- Check the application logs in the terminal
- Verify all dependencies are installed
- Ensure API keys have proper permissions
This project is open source. See the repository for license details.
Contributions are welcome! Please feel free to submit issues and pull requests.
Made with ❤️ for better code understanding