# 🧠 AI Code Decoder (Reliable Edition)

A robust Python code analysis tool powered by AI that provides intelligent code explanations, quizzes, and structural analysis with automatic fallback between multiple AI providers.

## ✨ Features

- **Multi-Provider AI Support**: OpenAI, Anthropic Claude, and Ollama with automatic fallback
- **Safe JSON Parsing**: Robust error handling and timeout management
- **Interactive Web Interface**: Built with Streamlit for easy code analysis
- **Code Structure Analysis**: Automatic detection of functions, classes, imports, and line counts
- **Study Modes**:
  - Explanation only
  - Explanation with quiz
  - Quiz only
- **Flexible Input**: Paste code directly or upload Python files
- **Configurable Settings**: Adjustable temperature, detail levels, and custom instructions
- **Health Checks**: Automatic provider availability detection
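The multi-provider fallback described above can be sketched roughly like this. The provider names and the callable interface are illustrative, not the project's actual API:

```python
# Illustrative sketch of provider fallback -- not the project's actual API.
# Each provider is a (name, callable) pair; a callable returns a reply or raises.

def ask_with_fallback(prompt, providers):
    """Try each provider in order; return (name, reply) from the first success."""
    errors = {}
    for name, call in providers:
        try:
            return name, call(prompt)
        except Exception as exc:  # network errors, bad keys, timeouts...
            errors[name] = str(exc)
    raise RuntimeError(f"All providers failed: {errors}")

# Usage with dummy providers standing in for real API clients:
def flaky(prompt):
    raise ConnectionError("service unreachable")

def local(prompt):
    return f"echo: {prompt}"

name, reply = ask_with_fallback("hi", [("openai", flaky), ("ollama", local)])
print(name, reply)  # ollama echo: hi
```

The key design point is that a provider failure is recorded rather than surfaced immediately, so the user only sees an error when every configured provider has failed.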

## 🏗️ Project Structure

```
ai-code-decoder/
├── app.py                    # Main Streamlit application
├── requirements.txt          # Python dependencies
├── README.md                 # This file
├── config/
│   └── settings_example.toml # Configuration template
└── core/
    ├── __init__.py           # Package initialization
    ├── ai_providers.py       # AI provider integrations
    └── analysis_utils.py     # Code analysis utilities
```

## 🚀 Installation & Setup

### Prerequisites

- Python 3.8 or higher
- pip package manager

### 1. Clone the Repository

```bash
git clone https://github.com/vinupalackal/ai-code-decoder.git
cd ai-code-decoder
```

### 2. Create a Virtual Environment (Recommended)

```bash
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate
```

### 3. Install Dependencies

```bash
pip install -r requirements.txt
```

### 4. Configure API Keys (Optional)

You can provide API keys either through the web interface or via environment variables:

**Environment variables:**

```bash
export OPENAI_API_KEY="your_openai_api_key_here"
export ANTHROPIC_API_KEY="your_anthropic_api_key_here"
```
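A tool like this would typically read those variables with `os.getenv`, preferring a key typed into the UI when one is present. A minimal illustration (the function name is hypothetical, not taken from the project):

```python
import os

def resolve_api_key(ui_value, env_var):
    """Prefer a key entered in the UI; otherwise fall back to the environment."""
    return ui_value or os.getenv(env_var) or None

os.environ["OPENAI_API_KEY"] = "sk-demo"           # stand-in value for the demo
print(resolve_api_key("", "OPENAI_API_KEY"))       # sk-demo (from environment)
print(resolve_api_key("sk-ui", "OPENAI_API_KEY"))  # sk-ui (UI wins)
```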

**Configuration file (optional):**

```bash
cp config/settings_example.toml config/settings.toml
# Edit settings.toml with your preferences
```

## 🎯 Usage

### Running the Application

```bash
streamlit run app.py
```

The application will start and be available at http://localhost:8501.

### Using Different AI Providers

#### OpenAI

1. Select "openai" from the Provider dropdown
2. Enter your API key in the sidebar (or set the `OPENAI_API_KEY` environment variable)
3. Choose a model (default: `gpt-4o-mini`)

#### Ollama (Local AI)

1. Install and start Ollama: https://ollama.ai
2. Pull a model: `ollama pull llama3.1`
3. Select "ollama" from the Provider dropdown
4. Ensure the Ollama URL is correct (default: `http://localhost:11434`)
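The health checks mentioned in the feature list could verify this URL by probing Ollama's `/api/tags` endpoint (the endpoint the Ollama server exposes for listing installed models). A sketch using only the standard library; this is an assumption about how such a check might look, not the project's exact code:

```python
import urllib.request
import urllib.error

def ollama_is_up(base_url="http://localhost:11434", timeout=2.0):
    """Return True if an Ollama server answers on /api/tags, else False."""
    try:
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False  # connection refused, DNS failure, timeout...

print(ollama_is_up())  # False unless an Ollama server is running locally
```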

#### Anthropic Claude

1. Select "anthropic" from the Provider dropdown
2. Enter your Anthropic API key
3. Choose a model (default: `claude-3-5-sonnet-latest`)

### Analysis Workflow

1. **Input code**: Paste Python code or upload a `.py` file
2. **Configure settings**:
   - Choose AI provider and model
   - Set detail level (low/medium/high)
   - Select study mode
   - Adjust temperature for creativity
3. **Review code stats**: View the automatic structural analysis
4. **Generate analysis**: Click "Run" to get an AI-powered explanation
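The "safe JSON parsing" feature matters at the last step because model replies sometimes wrap JSON in prose or code fences. A hedged sketch of the kind of tolerant parsing involved (the function name and strategy are illustrative, not the project's exact implementation):

```python
import json
import re

def parse_model_json(text):
    """Best-effort extraction of a JSON object from a model reply."""
    try:
        return json.loads(text)  # the clean case: the reply is pure JSON
    except json.JSONDecodeError:
        pass
    # Fall back to grabbing the outermost brace-delimited span
    match = re.search(r"\{.*\}", text, re.DOTALL)
    if match:
        try:
            return json.loads(match.group(0))
        except json.JSONDecodeError:
            pass
    return None  # caller can fall back to showing the raw reply

print(parse_model_json('Here you go: {"quiz": ["q1"]} Hope that helps!'))
# {'quiz': ['q1']}
```

Returning `None` instead of raising lets the UI degrade gracefully to showing the raw model output.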

## 🔧 Configuration Options

### Detail Levels

- **Low**: Brief, high-level overview
- **Medium**: Balanced explanation with key concepts
- **High**: Comprehensive, detailed analysis

### Study Modes

- **Explanation Only**: Pure code explanation
- **Explanation + Quiz**: Explanation followed by comprehension questions
- **Quiz Only**: Direct quiz generation for testing understanding

### Temperature Settings

- **0.0-0.3**: More focused, deterministic responses
- **0.4-0.7**: Balanced creativity and consistency
- **0.8-1.0**: More creative, varied responses

## 🛠️ Development

### Running in Development Mode

```bash
# Install development dependencies
pip install streamlit

# Run with auto-reload
streamlit run app.py --server.runOnSave true
```

### Code Structure

- `app.py`: Main Streamlit interface and user-interaction logic
- `core/ai_providers.py`: AI provider implementations with fallback logic
- `core/analysis_utils.py`: Code parsing and structural analysis
- `config/settings_example.toml`: Default configuration template
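The structural stats (functions, classes, imports, line count) can be computed with Python's standard `ast` module. A minimal sketch of the idea behind `analysis_utils.py`; the real module may differ:

```python
import ast

def code_stats(source):
    """Count functions, classes, imports, and lines in Python source."""
    tree = ast.parse(source)
    stats = {"functions": 0, "classes": 0, "imports": 0,
             "lines": len(source.splitlines())}
    for node in ast.walk(tree):
        if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef)):
            stats["functions"] += 1
        elif isinstance(node, ast.ClassDef):
            stats["classes"] += 1
        elif isinstance(node, (ast.Import, ast.ImportFrom)):
            stats["imports"] += 1
    return stats

sample = "import os\n\nclass A:\n    def f(self):\n        return os.getcwd()\n"
print(code_stats(sample))
# {'functions': 1, 'classes': 1, 'imports': 1, 'lines': 5}
```

Using `ast` rather than regexes means the counts survive odd formatting, nested definitions, and multi-line statements.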

## 🔍 Troubleshooting

### Common Issues

1. **Ollama connection failed**
   - Ensure Ollama is running: `ollama serve`
   - Check that the URL is correct (default: `http://localhost:11434`)
   - Verify you have models installed: `ollama list`
2. **OpenAI API errors**
   - Verify the API key is valid
   - Check account usage limits
   - Ensure the model name is correct
3. **File upload issues**
   - Only Python (`.py`) files are supported
   - Files must be UTF-8 encoded
   - Check file size limits

### Getting Help

- Check the application logs in the terminal
- Verify all dependencies are installed
- Ensure API keys have the proper permissions

## 📝 License

This project is open source. See the repository for license details.

## 🤝 Contributing

Contributions are welcome! Please feel free to submit issues and pull requests.


---

*Made with ❤️ for better code understanding*
