AI-powered recipe extraction from videos to Mealie
JarIt is an intelligent application that automatically extracts structured recipes from video content (YouTube, TikTok, Instagram, and more) using AI and seamlessly uploads them to your Mealie recipe manager.
- Multi-Platform Support - Extract recipes from YouTube, TikTok, Instagram, and more via yt-dlp
- AI-Powered Extraction - Uses Pydantic AI with support for multiple LLM providers:
- Google Gemini (default)
- OpenAI GPT Models
- Ollama (local models)
- Audio Transcription - Automatic video transcription using OpenAI Whisper
- Smart Recipe Parsing - Extracts ingredients, instructions, timing, and metadata
- Multi-Language - Translate recipes to your preferred language during extraction
- Recipe Editor - Review and edit extracted recipes before uploading
- Mealie Integration - One-click upload to your Mealie instance
- Multi-User Support - User authentication with admin panel
- Secure - JWT authentication, encrypted API keys, role-based access
- Docker Ready - Complete Docker setup for easy deployment
- Docker & Docker Compose
- Mealie instance
- API Keys for LLM provider:
- Google Gemini API key (best results so far - free tier available)
- OR OpenAI API key
- OR Ollama installed locally
- OpenAI API key (required for Whisper audio transcription when the recipe is not in the video description)
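Before starting, you can quickly verify that the required tools are installed. This is just a sketch for POSIX shells; adjust the command list to your setup:

```shell
# Check that the tools needed for the Docker setup are on PATH
for cmd in git docker; do
  if command -v "$cmd" >/dev/null 2>&1; then
    echo "$cmd: found"
  else
    echo "$cmd: MISSING - install it before continuing"
  fi
done
```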
The fastest way to get started is using Docker:
# 1. Clone the repository
git clone https://github.com/eFroD/recipeAgent
cd recipeAgent
# 2. Copy and configure environment file
cp .env_example .env
nano .env # Edit with your API keys
# 3. Start all services (PostgreSQL, Backend, Frontend)
docker compose up -d
# 4. Access the application
# Frontend: http://localhost
# Backend API docs: http://localhost:8000/docs

That's it! The application should now be running.
You can also use the docker-compose.yml directly:
version: '3.8'
services:
postgres:
image: postgres:15-alpine
environment:
- POSTGRES_DB=${POSTGRES_DB}
- POSTGRES_USER=${POSTGRES_USER}
- POSTGRES_PASSWORD=${POSTGRES_PASSWORD}
volumes:
- postgres_data:/var/lib/postgresql/data
networks:
- recipe-network
healthcheck:
test: ["CMD-SHELL", "pg_isready -U ${POSTGRES_USER} -d ${POSTGRES_DB}"]
interval: 10s
timeout: 5s
retries: 5
restart: unless-stopped
backend:
image: ghcr.io/efrod/jarit/backend:latest
expose:
- "8000"
environment:
- PYTHONPATH=/app
- DATABASE_URL=${DATABASE_URL}
- LOGFIRE_WRITE_TOKEN=${LOGFIRE_WRITE_TOKEN:-}
- LLM_PROVIDER=${LLM_PROVIDER}
- MODEL_NAME=${MODEL_NAME}
- GOOGLE_API_KEY=${GOOGLE_API_KEY}
- OPENAI_API_KEY=${OPENAI_API_KEY}
- OLLAMA_URL=${OLLAMA_URL:-}
- ALLOW_REGISTRATION=${ALLOW_REGISTRATION:-false}
- ACCESS_TOKEN_EXPIRE_MINUTES=${ACCESS_TOKEN_EXPIRE_MINUTES:-30}
- SECRET_KEY=${SECRET_KEY}
- ALGORITHM=${ALGORITHM:-HS256}
command: uv run uvicorn main:app --host 0.0.0.0 --port 8000
depends_on:
postgres:
condition: service_healthy
networks:
- recipe-network
restart: unless-stopped
frontend:
image: ghcr.io/efrod/jarit/frontend:latest
ports:
- "80:80"
depends_on:
- backend
networks:
- recipe-network
restart: unless-stopped
networks:
recipe-network:
driver: bridge
volumes:
postgres_data:
You will also need to create the .env file:
# Settings and API-Keys for Models to use
# Note: OPENAI_API_KEY is mandatory, as it powers Whisper
# Get the model and provider names from the Pydantic AI documentation.
LLM_PROVIDER=google
MODEL_NAME=gemini-2.5-flash
GOOGLE_API_KEY=YOUR-KEY-HERE
OPENAI_API_KEY=YOUR-KEY-HERE
# JWT settings
# Allow users to create new accounts. Set it to false if you want full control over who can access the app.
# If false, the first user created will be an admin user who can create other users.
ALLOW_REGISTRATION=false
ACCESS_TOKEN_EXPIRE_MINUTES=30
SECRET_KEY=CHANGEME
ALGORITHM=HS256
# PostgreSQL settings - Change credentials
POSTGRES_DB=devdb
POSTGRES_USER=devuser
POSTGRES_PASSWORD=devpassword
DATABASE_URL=postgresql://devuser:devpassword@postgres:5432/devdb
# Monitoring (optional but useful for debugging)
LOGFIRE_WRITE_TOKEN=LOGFIRE_TOKEN
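SECRET_KEY signs the JWTs, so it should not stay at CHANGEME. One way to generate a strong value, assuming openssl is available:

```shell
# Generate a 64-character hex secret suitable for SECRET_KEY
openssl rand -hex 32
```

Paste the output into the SECRET_KEY line of your .env file.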
Follow the instructions from the quick start, but use the docker-compose.dev.yml file:
# 1. Clone the repository
git clone https://github.com/eFroD/jarit
cd jarit
# 2. Copy and configure environment file
cp .env_example .env
nano .env # Edit with your API keys
# 3. Start all services (PostgreSQL, Backend, Frontend)
docker compose -f docker-compose.dev.yml up --build -d
# 4. Access the application
# Frontend: http://localhost
# Backend API docs: http://localhost:8000/docs

For local development with hot reload:
# 1. Install uv package manager (if not already installed)
curl -LsSf https://astral.sh/uv/install.sh | sh
# 2. Navigate to project root
cd jarit
# 3. Install Python dependencies
uv sync
# 4. Start PostgreSQL (using Docker)
docker compose up -d postgres
# 5. Configure environment
cp .env_example .env
nano .env # Add your API keys
# 6. Run backend with hot reload
uv run uvicorn main:app --host 0.0.0.0 --port 8000 --reload --env-file .env

# 1. Navigate to frontend directory
cd frontend
# 2. Install dependencies
npm install
# 3. Configure environment
echo "VITE_API_BASE=http://localhost:8000/api/v1" > .env.local
# 4. Start development server
npm run dev
# Frontend will be available at http://localhost:5173

Edit the .env file in the project root:
# LLM Provider Configuration
LLM_PROVIDER=google # Options: google, openai, ollama
MODEL_NAME=gemini-2.5-flash # Model to use for extraction
GOOGLE_API_KEY=your_key_here # Required if using Google
OPENAI_API_KEY=your_key_here # Required for Whisper (always) and GPT models
# Authentication Settings
ALLOW_REGISTRATION=true # Enable/disable public registration
ACCESS_TOKEN_EXPIRE_MINUTES=30 # JWT token expiration
SECRET_KEY=your_secret_key
ALGORITHM=HS256
POSTGRES_DB=devdb
POSTGRES_USER=devuser
POSTGRES_PASSWORD=devpassword
DATABASE_URL=postgresql://devuser:devpassword@postgres:5432/devdb
VITE_API_BASE=http://localhost:8000/api/v1

Google Gemini offers a free tier:
- Visit Google AI Studio
- Sign in with your Google account
- Click "Get API Key" or "Create API Key"
- Copy the generated API key
- Add to .env: GOOGLE_API_KEY=your_key_here
Required for Whisper transcription (mandatory) and optional for GPT models:
- Visit OpenAI API Keys
- Sign in or create an account
- Click "Create new secret key"
- Name your key
- Copy the API key
- Add to .env: OPENAI_API_KEY=your_key_here
Note: OpenAI requires payment setup. Whisper API is very affordable (~$0.006/minute).
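As a rough illustration of that rate (assuming the ~$0.006/minute price, which may change), transcription cost scales linearly with video length:

```shell
# Estimate Whisper transcription cost for a video of a given length
minutes=10
awk -v m="$minutes" 'BEGIN { printf "Estimated cost: $%.3f\n", m * 0.006 }'
```

So even a 10-minute cooking video costs only a few cents to transcribe.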
To upload recipes to Mealie, you need to generate an API token:
Open your Mealie instance in a web browser (e.g., http://your-mealie-instance:9000)
Log in with your Mealie credentials
- Click on your profile icon (usually in the top-right corner)
- Select "Profile" from the dropdown menu
- In your profile page, look for the "API Tokens" section
- Click on "API Tokens" or scroll down to the tokens section
- Click the "Create API Token" or "Generate Token" button
- Enter a name for your token.
- Set an expiration date (optional - "Never" is recommended for convenience)
- Click "Create" or "Generate"
- Your new API token will be displayed (usually only once!)
- Copy the entire token - it should look something like:
eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJzdWIiOiJ1c2VyQGV4YW1wbGUuY29tIiw...
- Store it securely - you won't be able to see it again!
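If you want to check the token before entering it in JarIt, you can query an authenticated Mealie endpoint directly. The URL and endpoint path below are assumptions based on a typical Mealie v1 setup; adjust them for your instance:

```shell
# Replace with your own instance URL and the token you just copied
MEALIE_URL="http://your-mealie-instance:9000"
MEALIE_TOKEN="paste-your-token-here"

# A successful response with your user profile means the token works
curl -fsS -H "Authorization: Bearer $MEALIE_TOKEN" "$MEALIE_URL/api/users/self"
```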
- Log in to JarIt
- Navigate to your Dashboard
- Scroll to the "Mealie Configuration" section
- Enter your Mealie details:
  - Base URL: Your Mealie instance URL (e.g., http://192.168.1.100:9000)
  - API Key: Paste the token you just copied
- Click "Save" or "Test Connection"
- Access the application at http://localhost (Docker) or http://localhost:5173 (dev)
- Register the first user - they will automatically become an admin
- Set up Mealie integration in the dashboard (see Obtaining Mealie API Key)
- (Optional) Disable registration - Set ALLOW_REGISTRATION=false in .env for security
- Navigate to Dashboard
- Paste a video URL (YouTube, TikTok, Instagram, etc.)
- Select target language (optional - defaults to English)
- Click "Extract Recipe"
- Wait for AI to process (10-60 seconds depending on video length)
- Review and edit the extracted recipe
- Upload to Mealie with one click!
Admins can manage users:
- Navigate to Admin Panel
- View all users and their roles
- Create new users manually
- Delete users
I think the app has much more potential than what it can do now. Check the Issues if you want to see what is going on. I am also thinking about the following enhancements:
- Integrate new recipe databases beyond Mealie. The app is designed so that you can write your own integrator; the database can store multiple API keys, identified by a service name.
- Support for more LLM providers and better self-hosted model support. I have not tested any of the self-hosted functionality yet; any insight is greatly appreciated.
- And more! If you have any idea, feel free to post an issue.
