Zedel17/pdf-qa-app

PDF Q&A platform with Go backend, RAG pipeline, and Ollama integration

PDF Q&A Platform

A full-stack web application for uploading PDF documents and asking questions about their contents using natural language processing and RAG (Retrieval-Augmented Generation).

Features

  • 📄 PDF upload with drag & drop support
  • 💬 Chat-style Q&A interface
  • 🤖 Local LLM integration (Ollama with Llama 3.2 3B)
  • 🔍 RAG pipeline with keyword-based retrieval
  • ⚡ Real-time answers based on document content
  • 🎨 Modern UI with Tailwind CSS

Tech Stack

Backend (/backend)

  • Language: Go 1.21+
  • Framework: Gin
  • PDF Processing: pdfcpu
  • LLM: Ollama (Llama 3.2 3B)
  • Storage: Local filesystem

Frontend (/frontend)

  • Framework: React 19
  • Language: TypeScript
  • Build Tool: Vite
  • Styling: Tailwind CSS 3.4
  • HTTP Client: Fetch API

Project Structure

pdf-qa-app/
├── backend/                 # Go backend
│   ├── cmd/server/         # Application entry point
│   ├── internal/
│   │   ├── api/           # HTTP handlers
│   │   ├── pdf/           # PDF text extraction
│   │   ├── rag/           # RAG pipeline
│   │   └── storage/       # File storage
│   ├── uploads/           # Uploaded PDFs and extracted text
│   └── go.mod
│
├── frontend/               # React frontend
│   ├── src/
│   │   ├── components/    # React components
│   │   ├── api/          # Backend API client
│   │   └── types/        # TypeScript interfaces
│   └── package.json
│
└── README.md

Quick Start

Already have everything installed? Start both servers with one command:

./start.sh

Then open http://localhost:5173 in your browser.

Press Ctrl+C to stop both servers.


Setup & Installation

Prerequisites

  1. Go 1.21 or higher
  2. Node.js 18+ and npm
  3. Ollama with Llama 3.2 3B model

Install Ollama and Model

# Install Ollama (macOS)
brew install ollama

# Start Ollama service
ollama serve

# Pull the model (in another terminal)
ollama pull llama3.2:3b

Backend Setup

# Navigate to backend directory
cd backend

# Install dependencies
go mod download

# Run the server
go run cmd/server/main.go

The backend will start on http://localhost:8080

Frontend Setup

# Navigate to frontend directory
cd frontend

# Install dependencies
npm install

# Start development server
npm run dev

The frontend will start on http://localhost:5173

Usage

  1. Start both servers: Run ./start.sh (or manually start backend on :8080, frontend on :5173)
  2. Open browser to http://localhost:5173
  3. Upload a PDF by dragging & dropping or clicking "Choose File"
  4. Ask questions about the document in the chat interface
  5. Get answers from the local LLM, grounded in the document's content

API Endpoints

Backend API (http://localhost:8080)

  • GET /health - Health check
  • POST /upload - Upload a PDF file
  • POST /query - Ask a question about the uploaded document

See /backend/README.md for detailed API documentation.

Development

Backend

cd backend
go run cmd/server/main.go

Frontend

cd frontend
npm run dev        # Development server
npm run build      # Production build
npm run lint       # Run ESLint

How It Works

  1. Upload: User uploads PDF → Backend extracts text using pdfcpu
  2. Storage: Text is stored with unique document ID
  3. Query: User asks question → RAG pipeline retrieves relevant chunks
  4. Generation: Ollama LLM generates answer based on retrieved context
  5. Display: Answer shown in chat interface
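The retrieval step (step 3) uses keyword matching rather than embeddings. A minimal sketch of how such a pipeline might chunk and rank text in Go — the repo's actual chunk size, scoring, and matching rules are unknown, and the naive substring match below would also hit "go" inside "dragon":

```go
package main

import (
	"fmt"
	"sort"
	"strings"
)

// splitChunks breaks extracted text into fixed-size word chunks.
// chunkSize is a tuning knob; the real pipeline's value is unknown.
func splitChunks(text string, chunkSize int) []string {
	words := strings.Fields(text)
	var chunks []string
	for i := 0; i < len(words); i += chunkSize {
		end := i + chunkSize
		if end > len(words) {
			end = len(words)
		}
		chunks = append(chunks, strings.Join(words[i:end], " "))
	}
	return chunks
}

// scoreChunk counts how many query keywords appear in the chunk
// (case-insensitive substring match — deliberately naive).
func scoreChunk(chunk string, keywords []string) int {
	lower := strings.ToLower(chunk)
	score := 0
	for _, kw := range keywords {
		if strings.Contains(lower, strings.ToLower(kw)) {
			score++
		}
	}
	return score
}

// topChunks returns the k highest-scoring chunks for the question; these
// would then be passed to the LLM as context.
func topChunks(chunks []string, question string, k int) []string {
	keywords := strings.Fields(question)
	type scored struct {
		text  string
		score int
	}
	ranked := make([]scored, len(chunks))
	for i, c := range chunks {
		ranked[i] = scored{c, scoreChunk(c, keywords)}
	}
	sort.SliceStable(ranked, func(i, j int) bool { return ranked[i].score > ranked[j].score })
	if k > len(ranked) {
		k = len(ranked)
	}
	out := make([]string, k)
	for i := 0; i < k; i++ {
		out[i] = ranked[i].text
	}
	return out
}

func main() {
	doc := "Go is a statically typed language. Python is dynamically typed. Go compiles to native code."
	chunks := splitChunks(doc, 6)
	best := topChunks(chunks, "Go typed", 1)
	fmt.Println(best[0])
}
```

The planned move to vector embeddings (see Future Enhancements) would replace `scoreChunk` with a similarity search over embedded chunks.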

Future Enhancements

  • Vector embeddings for semantic search
  • Multiple document support
  • Document history and management
  • Export chat conversations
  • Docker Compose setup
  • User authentication
  • Database persistence
  • Unit and integration tests

License

MIT

Author

Federico Mercurio
