# RAGify – Chat with Your Documents, Anytime

RAGify is an intelligent retrieval-augmented chat application that lets users upload documents, ask questions about their content, and manage a personal knowledge base with persistent conversation history.

## Overview

This project implements the RAG (Retrieval-Augmented Generation) pattern to create a chatbot that answers questions based on a set of documents provided by the user. The application is fully interactive, with a web interface built in Streamlit, and includes a robust backend to manage users, conversations, and each user's vector indexes.
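The heart of the RAG pattern is the retrieval step: embed the user's question, score it against the embeddings of the stored document chunks, and hand the top matches to the LLM as context. A minimal sketch of that step in plain Python (the toy 3-dimensional vectors and the `retrieve` helper are illustrative only, not this project's actual code, which uses FAISS for the real index):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query_vec, chunks, k=2):
    """Return the k chunk texts whose embeddings are closest to the query."""
    scored = sorted(chunks, key=lambda c: cosine(query_vec, c["vec"]), reverse=True)
    return [c["text"] for c in scored[:k]]

# Toy "embeddings" standing in for real model output.
chunks = [
    {"text": "RAGify stores one FAISS index per user.", "vec": [1.0, 0.0, 0.1]},
    {"text": "Chat history lives in SQLite.",           "vec": [0.0, 1.0, 0.1]},
    {"text": "Uploads support PDF and DOCX.",           "vec": [0.9, 0.1, 0.0]},
]
top = retrieve([1.0, 0.0, 0.0], chunks, k=2)
print(top)
```

FAISS performs essentially this ranking, but over high-dimensional embeddings and with index structures that scale far beyond a linear scan.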


## 🔄 Application Flow

The diagram below illustrates the flow of the application, from user login to document upload, processing, and interaction with the chat.

*(Application flow diagram)*

## ✨ Key Features

- **User Authentication:** sign-up/login system that keeps each user's data private and persistent.
- **Multi-format Upload:** supports `.pdf`, `.docx`, `.xlsx`, `.csv`, `.txt`, and `.md` files.
- **User Data Persistence:**
  - **Chat History:** conversations are saved in a SQLite database and reloaded on each login.
  - **Knowledge Base:** uploaded documents are converted into vectors and stored in a dedicated FAISS index per user.
- **File Management:** users can view and delete previously uploaded files, and the knowledge base is updated accordingly.
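The chat-history half of the persistence story is plain SQLite. A minimal sketch of what saving and reloading a conversation might look like (the table schema and function names here are assumptions for illustration; the app's actual schema may differ):

```python
import sqlite3

# Hypothetical schema for per-user chat history; the real app's schema may differ.
conn = sqlite3.connect(":memory:")  # the app would use a file, e.g. "ragify.db"
conn.execute("""
    CREATE TABLE messages (
        id INTEGER PRIMARY KEY AUTOINCREMENT,
        username TEXT NOT NULL,
        role TEXT NOT NULL,         -- 'user' or 'assistant'
        content TEXT NOT NULL,
        created_at TEXT DEFAULT CURRENT_TIMESTAMP
    )
""")

def save_message(username, role, content):
    """Append one chat message for a user (parameterized to avoid SQL injection)."""
    conn.execute(
        "INSERT INTO messages (username, role, content) VALUES (?, ?, ?)",
        (username, role, content),
    )
    conn.commit()

def load_history(username):
    """Reload a user's conversation on login, oldest message first."""
    rows = conn.execute(
        "SELECT role, content FROM messages WHERE username = ? ORDER BY id",
        (username,),
    )
    return list(rows)

save_message("alice", "user", "What formats can I upload?")
save_message("alice", "assistant", "PDF, DOCX, XLSX, CSV, TXT and MD.")
print(load_history("alice"))
```

Filtering by `username` in every query is what keeps one user's history invisible to another.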

## 🛠️ Technologies Used


## 🚀 Installation and Setup

Follow the steps below to set up and run the project locally.

### 1. Prerequisites

- **Python 3.12 or higher:** [Download Python](https://www.python.org/downloads/)
- **Git:** to clone the repository.
- **Ollama:** the application uses Ollama to run the Llama 3 model locally.
  - Install Ollama.
  - After installation, pull the `llama3` model:

    ```sh
    ollama pull llama3
    ```

  - Make sure Ollama is running before starting the Streamlit app.
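To confirm Ollama is actually up before launching the app, you can hit its HTTP endpoint, which listens on port 11434 by default. A small check (the `ollama_is_running` helper is ours, not part of the project):

```python
import urllib.error
import urllib.request

def ollama_is_running(base_url="http://localhost:11434"):
    """Return True if the Ollama server answers on its default port."""
    try:
        with urllib.request.urlopen(base_url, timeout=2) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        # Connection refused / timeout => server is not reachable.
        return False

print(ollama_is_running())
```

Running `curl http://localhost:11434` from a shell accomplishes the same thing.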

### 2. Clone the Repository

```sh
git clone https://github.com/PLeonLopes/RAGify.git
cd RAGify
```

### 3. Create a Virtual Environment and Install Dependencies

```sh
# Create the virtual environment
python3 -m venv venv

# Activate the virtual environment
venv\Scripts\activate           # <- Windows
source venv/bin/activate        # <- macOS/Linux
```

Now install all dependencies:

```sh
pip install -r requirements.txt
```

### 4. Run the Application

With the virtual environment activated, start the Ollama server and then the Streamlit app.

Start the Ollama server (this serves the `llama3` model pulled earlier):

```sh
ollama serve
```

Start the Streamlit application:

```sh
streamlit run src/app.py
```

The application should automatically open in your default browser.

## 📖 How to Use

1. **Create an Account:** in the sidebar, enter a username and password and click "Create Account".
2. **Login:** use the same credentials to log in.
3. **Upload Files:** in the sidebar, select one or more documents and click the "Process Files" button.
4. **Wait for Processing:** the application extracts the text, generates embeddings, and saves your knowledge base.
5. **Chat:** once the files are processed, go to the main chat area, type your question, and click "Send".
6. **Manage Your Files:** in the sidebar, view the list of uploaded files and remove any of them by clicking the trash icon.
