The LLM Fact Checker helps verify the accuracy of information on the Internet. The application combines OpenAI's LLMs with the Tavily search engine to find references in already published sources. The goal is to cross-check statements quickly, making disinformation easier to detect. Which media outlets are considered trustworthy, and which should be ignored during reference searches, can be configured on the extension's settings page.
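The trusted-source configuration described above could work roughly like the following sketch: search hits are kept only if their hostname appears on a user-configured allow list. All names and example domains here are assumptions for illustration, not the extension's actual code.

```python
from urllib.parse import urlparse

# Assumed example configuration; in the real extension these lists would
# come from the settings page, not hard-coded constants.
TRUSTED = {"reuters.com", "apnews.com"}
BLOCKED = {"example-tabloid.com"}

def filter_references(results):
    """Keep search results whose hostname is on the trusted list."""
    kept = []
    for r in results:
        host = (urlparse(r["url"]).hostname or "").removeprefix("www.")
        if host in TRUSTED and host not in BLOCKED:
            kept.append(r)
    return kept

hits = [
    {"url": "https://www.reuters.com/article/x", "title": "A"},
    {"url": "https://example-tabloid.com/y", "title": "B"},
]
print([h["title"] for h in filter_references(hits)])  # ['A']
```

Filtering by hostname keeps the allow list short and independent of individual article URLs.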
Screenshots
Demo Videos
Known Issues
- Buffer window for audio processing is partially faulty
- Video player detection (except YouTube) is unreliable
- Preload time is still too long – should reference searches be parallelized?
- Source configuration only partially implemented
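The parallelization question raised in the issue list could be explored along these lines: instead of running the per-claim reference searches sequentially, dispatch them to a thread pool. `search_references` below is a stand-in stub, not the project's real API.

```python
from concurrent.futures import ThreadPoolExecutor

def search_references(claim):
    # Stub standing in for the real Tavily-backed search call.
    return f"sources for: {claim}"

def check_claims(claims, max_workers=4):
    """Run reference searches for all claims concurrently."""
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(search_references, claims))

print(check_claims(["claim A", "claim B"]))
```

Since reference searches are I/O-bound network calls, threads (rather than processes) are usually enough to cut the preload time roughly to the duration of the slowest single search.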
Tech Stack
- Backend: Flask
- LLM Service: OpenAI
- Containerization: Docker, Docker Compose
To try out the extension, start the backend and add the extension to Chrome.
docker-compose up

Additionally, a .env file with the API keys must be created in the backend/ directory:

# backend/.env
OPENAI_API_KEY=...   # OpenAI (LLM/Whisper)
TAVILY_API_KEY=...   # Tavily search
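For reference, a minimal docker-compose.yml for the backend might look like the following sketch. The service name, build path, and port are assumptions; the Flask default port 5000 is used here for illustration.

```yaml
# Hypothetical compose file; adjust names and ports to the actual project.
services:
  backend:
    build: ./backend
    env_file:
      - ./backend/.env   # OPENAI_API_KEY, TAVILY_API_KEY
    ports:
      - "5000:5000"      # Flask default port
```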