Transform your news consumption with AI-powered critical analysis. Think critically, verify facts, and understand the world beyond headlines.
A modern web-based news platform that applies AI-powered critical analysis to your reading, built with Flask, MongoDB, and HTMX.
- RSS Feed Aggregation: Automatically fetch news from multiple RSS sources
- AI-Powered Analysis: Critical analysis using OpenAI, Anthropic, or local Ollama
- Web Interface: Modern UI with Tabler CSS and HTMX for dynamic interactions
- MongoDB Storage: Persistent storage for articles and analysis results
- Modular Architecture: Blueprint-based Flask application
- Docker Support: Easy deployment with Docker and docker-compose
- MCP Ready: Basic structure and UI ready for future Model Context Protocol integration
- Backend: Python 3.12, Flask 3
- Database: MongoDB (pymongo)
- UI: Tabler CSS (via CDN) + HTMX (zero build)
- AI Providers: OpenAI, Anthropic, Ollama
- Scraping: BeautifulSoup4, markdownify
- RSS: feedparser
- Containerization: Docker + docker-compose
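For intuition, the RSS aggregation step reduces to parsing feed XML into article records. A stdlib-only sketch of the kind of work feedparser does for the app (the feed content here is made up for illustration):

```python
import xml.etree.ElementTree as ET

SAMPLE_RSS = """<?xml version="1.0"?>
<rss version="2.0"><channel>
  <title>Example Feed</title>
  <item><title>First headline</title><link>https://example.com/1</link></item>
  <item><title>Second headline</title><link>https://example.com/2</link></item>
</channel></rss>"""

def parse_feed(xml_text):
    """Extract (title, link) pairs from an RSS 2.0 document."""
    root = ET.fromstring(xml_text)
    return [
        (item.findtext("title"), item.findtext("link"))
        for item in root.iter("item")
    ]

print(parse_feed(SAMPLE_RSS)[0])  # ('First headline', 'https://example.com/1')
```

The real service also handles malformed feeds, encodings, and date parsing, which is exactly why feedparser is used instead of raw XML handling.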
- Docker & Docker Compose
- Ollama running locally (`ollama serve`)
- MongoDB (optional - can use Docker)
```bash
git clone <your-repo>
cd news_agent_web
docker-compose up news-agent-standalone --build
```

Then go to http://localhost:8080.
```bash
git clone <your-repo>
cd news_agent_web
# Modify docker-compose.yml: uncomment the mongodb section
docker-compose --profile full up --build
```

Or start the services separately:

```bash
docker-compose --profile mongodb-only up -d        # Only MongoDB
docker-compose up news-agent-standalone --build    # Then the app
```
1. Install Ollama:

   ```bash
   # macOS/Linux
   curl -fsSL https://ollama.ai/install.sh | sh
   # Windows: download from https://ollama.ai/download
   ```

2. Start the Ollama service:

   ```bash
   ollama serve
   ```

3. Download a model (optional but recommended):

   ```bash
   ollama pull llama2:7b
   # or
   ollama pull mistral:7b
   ```
- Get your key: https://platform.openai.com/api-keys
- Cost: ~$0.01-0.03 per 1K tokens
- Models: GPT-4, GPT-3.5-turbo
- Get your key: https://console.anthropic.com/
- Cost: ~$0.0025-0.015 per 1K tokens
- Models: Claude 3 Haiku, Sonnet, Opus
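For a rough sense of what those per-token rates mean per article, a back-of-the-envelope helper (the rates are the approximate figures quoted above, not live pricing):

```python
def estimate_cost(tokens, usd_per_1k_tokens):
    """Rough API cost in USD for a single analysis of the given token count."""
    return tokens * usd_per_1k_tokens / 1000

# A ~2,000-token article analysis:
print(estimate_cost(2000, 0.01))    # 0.02  (GPT at ~$0.01/1K tokens)
print(estimate_cost(2000, 0.0025))  # 0.005 (Claude Haiku at ~$0.0025/1K tokens)
```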
- Get your key: https://www.scrapingdog.com/
- Cost: $29/month for 1,000 requests
- Why required: Enables web search and fact verification
- Alternative: Free tier available for testing
Note: All API keys are managed through the web interface and stored securely in MongoDB.
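When stored keys are surfaced back in the UI, a common pattern is to display only a masked form. This is a hedged sketch of that pattern, not a description of this app's actual code:

```python
def mask_key(key, visible=4):
    """Hide an API key except its last few characters, e.g. for a settings page."""
    if len(key) <= visible:
        return "*" * len(key)
    return "*" * (len(key) - visible) + key[-visible:]

print(mask_key("sk-abcdef123456"))  # masks all but the last four characters
```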
- Python 3.12+
- MongoDB (or Docker)
- Ollama (optional, for local AI)
1. Clone the repository:

   ```bash
   git clone <repository-url>
   cd news_agent_web
   ```

2. Install dependencies:

   ```bash
   pip install -r requirements.txt
   ```

3. Start MongoDB (if not using Docker):

   ```bash
   # Install and start MongoDB locally, or use Docker:
   docker run -d -p 27017:27017 mongo:7.0
   ```

4. Run the application:

   ```bash
   python run.py
   ```

The application will be available at http://localhost:8080.
1. Build and run with docker-compose:

   ```bash
   docker-compose up --build
   ```

2. Access the application:
   - Web UI: http://localhost:8080
   - MongoDB: localhost:27017
All API keys are managed through the web interface and stored securely in the MongoDB database:
- Go to Settings (`/settings`) in the web interface
- Configure your API keys for the services you want to use
- Test the connections to ensure everything works
- Keys are encrypted and stored securely in the database
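The "test the connections" step amounts to a quick reachability probe. A stdlib sketch of what such a check might look like (the endpoint and timeout are illustrative assumptions, not this app's exact implementation):

```python
import urllib.request
import urllib.error

def provider_reachable(base_url, timeout=2):
    """Return True if the provider endpoint answers an HTTP request at all."""
    try:
        urllib.request.urlopen(base_url, timeout=timeout)
        return True
    except urllib.error.HTTPError:
        return True   # the server answered, even if with an error status
    except (urllib.error.URLError, OSError):
        return False  # connection refused, DNS failure, timeout, ...

print(provider_reachable("http://localhost:11434"))  # True if Ollama is running
```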
- Required: Ollama (for local AI), MongoDB (database)
- Recommended: OpenAI/Claude (for better AI analysis)
- Required for verification: ScrapingDog (enables web search and fact-checking)
The application uses sensible defaults for local development. For production deployment, you can configure these environment variables:
```bash
# Flask Configuration
FLASK_APP=app
FLASK_ENV=production
SECRET_KEY=your-production-secret-key

# MongoDB Configuration
MONGO_URI=mongodb://your-mongodb-server:27017/news_agent_web

# Ollama Configuration
OLLAMA_BASE_URL=http://your-ollama-server:11434
```

- Navigate to `/news` to view aggregated articles
- Click "Refresh" to fetch fresh news from RSS feeds
- Filter by language (IT/EN)
- View article details and perform analysis
- Article Analysis: Click "Analizza" on any article
- Text Analysis: Use the "Analizza Testo" feature for custom content
- URL Analysis: Analyze articles directly from URLs
- View detailed analysis results with credibility scores
- Configure AI providers and models through the web interface
- Set language preferences
- Manage RSS sources
- Configure and test AI provider connections
- API Keys: Securely stored in the MongoDB database, managed via the web UI
- Access MCP status at `/mcp`
- ⚠️ Currently in Development: Filesystem, HTTP, and search adapters are stub implementations
- Ready for Integration: Full MCP (Model Context Protocol) support planned for future releases
- Current Status: Basic structure and UI ready, core functionality to be implemented
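Since the adapters are stubs, one plausible shape for the dispatch layer behind the MCP routes is sketched below. The function and adapter names and return shapes are assumptions for illustration, not the app's actual code:

```python
STUB_ADAPTERS = {"filesystem", "http", "search"}

def mcp_dispatch(adapter, payload=None):
    """Route an MCP request; every adapter is currently a stub."""
    if adapter not in STUB_ADAPTERS:
        return {"status": "error", "message": f"unknown adapter: {adapter}"}
    # Real adapter logic would go here once MCP support lands.
    return {"status": "not_implemented", "adapter": adapter}

print(mcp_dispatch("filesystem"))  # {'status': 'not_implemented', 'adapter': 'filesystem'}
```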
- `GET /news/api/articles` - Get articles
- `GET /news/api/sources` - Get available sources
- `GET /news/fetch` - Fetch fresh news
- `GET /news/api/articles/<id>` - Get specific article
- `POST /analysis/api/analyze` - Analyze article
- `POST /analysis/api/analyze-text` - Analyze custom text
- `POST /analysis/api/analyze-url` - Analyze URL
- `GET /analysis/api/analyses` - Get analysis history
- `GET /settings/api/settings` - Get current settings
- `POST /settings/api/settings` - Update settings
- `GET /settings/api/providers` - Get AI providers
- `POST /settings/api/test-provider` - Test provider
- `GET /mcp/api/status` - Get MCP status
- `POST /mcp/api/filesystem` - Filesystem operations (⚠️ stub - not yet implemented)
- `POST /mcp/api/http` - HTTP operations (⚠️ stub - not yet implemented)
- `POST /mcp/api/search` - Search operations (⚠️ stub - not yet implemented)
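Any HTTP client can drive these endpoints. A stdlib sketch that prepares a JSON POST for the text-analysis route (the `text` field in the body is an assumption about the request schema):

```python
import json
import urllib.request

def build_analyze_request(base_url, text):
    """Prepare a POST to /analysis/api/analyze-text with a JSON body."""
    return urllib.request.Request(
        f"{base_url}/analysis/api/analyze-text",
        data=json.dumps({"text": text}).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_analyze_request("http://localhost:8080", "Some article text")
print(req.full_url, req.method)  # http://localhost:8080/analysis/api/analyze-text POST
# Send with urllib.request.urlopen(req) once the app is running.
```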
```
news_agent_web/
├── app/
│   ├── blueprints/              # Flask blueprints
│   │   ├── news.py              # News management
│   │   ├── analysis.py          # Analysis features
│   │   ├── settings.py          # Configuration
│   │   └── mcp.py               # MCP adapter (stub implementation)
│   ├── models/                  # MongoDB models
│   │   ├── article.py           # Article model
│   │   ├── analysis.py          # Analysis model
│   │   └── settings.py          # Settings model
│   ├── services/                # Business logic
│   │   ├── news_service.py      # RSS aggregation
│   │   ├── ai_service.py        # AI providers
│   │   ├── analysis_service.py  # Critical analysis
│   │   └── scraping_service.py  # Web scraping
│   ├── templates/               # Jinja2 templates
│   └── static/                  # Static assets
├── docker-compose.yml           # Docker configuration
├── Dockerfile                   # Container definition
├── requirements.txt             # Python dependencies
└── run.py                       # Application entry point
```
The application follows a modular architecture:
- Blueprints: Separate Flask modules for different features
- Models: MongoDB document models with validation
- Services: Business logic and external integrations
- Templates: Jinja2 templates with Tabler CSS
- Static: CSS, JS, and image assets
- Create a new blueprint in `app/blueprints/`
- Add models if needed in `app/models/`
- Implement services in `app/services/`
- Create templates in `app/templates/`
- Register the blueprint in `app/__init__.py`
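The blueprint-creation and registration steps above, condensed into one runnable sketch (requires Flask; the blueprint name, route, and file locations in the comments are illustrative, not part of this repo):

```python
from flask import Flask, Blueprint, jsonify

# Step 1: the new blueprint (would live in app/blueprints/example.py)
example_bp = Blueprint("example", __name__, url_prefix="/example")

@example_bp.route("/api/ping")
def ping():
    return jsonify({"status": "ok"})

# Last step: registration (would happen in app/__init__.py's factory)
def create_app():
    app = Flask(__name__)
    app.register_blueprint(example_bp)
    return app

client = create_app().test_client()
print(client.get("/example/api/ping").get_json())  # {'status': 'ok'}
```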
```bash
# Run tests (when implemented)
python -m pytest

# Run with coverage
python -m pytest --cov=app
```

1. Fork the repository
2. Create a feature branch (`git checkout -b feature/amazing-feature`)
3. Make your changes and commit them (`git commit -m 'Add amazing feature'`)
4. Add tests if applicable
5. Push to the branch (`git push origin feature/amazing-feature`)
6. Submit a pull request
- Follow PEP 8 style guidelines
- Add docstrings to new functions
- Test your changes locally before submitting
- Update documentation if needed
This project is licensed under the MIT License - see the LICENSE file for details.
- Tabler: Modern UI components and design system
- HTMX: Dynamic web interactions without JavaScript
- Flask: Python web framework
- MongoDB: NoSQL database
- Ollama: Local AI inference engine
- OpenAI & Anthropic: Cloud AI services
- ScrapingDog: Web scraping and search API
- news_agent - CLI version with Rich terminal interface
- news_agent_web - Web version with Flask and modern browser interface