
LLM Fact Checker

Overview

The LLM Fact Checker helps verify the accuracy of information found on the Internet. It combines OpenAI's LLMs with the Tavily search engine to find references in already published sources, making it possible to cross-check statements quickly and detect disinformation. The extension's settings page lets you configure which media portals are considered trustworthy and which should be ignored during reference searches.
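The repository excerpt does not show how the trusted-portal configuration is applied, but the filtering step described above could be sketched as follows. The function name and the shape of the search results are assumptions for illustration, not taken from the source:

```python
from urllib.parse import urlparse

def filter_trusted(results, trusted_domains):
    """Keep only search results whose host matches a configured trusted portal.

    `results` is assumed to be a list of dicts with a "url" key, as a search
    API like Tavily would typically return.
    """
    trusted = []
    for result in results:
        host = urlparse(result["url"]).netloc.lower()
        # Accept exact matches and subdomains (e.g. "news.bbc.co.uk" for "bbc.co.uk").
        if any(host == d or host.endswith("." + d) for d in trusted_domains):
            trusted.append(result)
    return trusted
```

With this shape, the settings page would only need to persist a list of domain strings, which the backend passes into the filter on every search.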

Demo

Screenshots

Screenshot 1 Screenshot 2 Screenshot 3 Screenshot 4 Screenshot 5 Settings

Demo Videos

Demo 1 Demo 2 Demo 3

Current Issues

  • Buffer window for audio processing is partially faulty
  • Video player detection (except YouTube) is unreliable
  • Preload time is still too long – should reference searches be parallelized?
  • Source configuration only partially implemented
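One of the open questions above is whether the reference searches can be parallelized to reduce preload time. Since each claim is looked up independently, a thread pool is a plausible approach; `search_references` below is a hypothetical stand-in for the real per-claim search call:

```python
from concurrent.futures import ThreadPoolExecutor

def search_references(claim):
    # Hypothetical stand-in for the real (network-bound) Tavily search call.
    return f"references for: {claim}"

def search_all(claims, max_workers=4):
    """Run the per-claim reference searches concurrently instead of one by one."""
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        # pool.map() preserves the input order of the claims.
        return list(pool.map(search_references, claims))
```

Because the searches are I/O-bound HTTP requests, threads (or async requests) should cut the total wait time to roughly that of the slowest single search.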

Tech Stack

  • Backend: Flask
  • LLM Service: OpenAI
  • Containerization: Docker, Docker Compose

Running the Application

To try out the extension, start the backend and load the extension into Chrome.

First, create a .env file with the API keys in the backend/ directory:

# backend/.env
OPENAI_API_KEY=...   # OpenAI (LLM/Whisper)
TAVILY_API_KEY=...   # Tavily search

Then start the backend:

docker-compose up

About

LLM-based reference search extension for Chrome, shipped with a Dockerized Flask backend.
