offline-llama-chatbot
A simple offline chatbot built using the Llama 3.2:1B model via Ollama and LangChain.
This project demonstrates how to run a local large language model completely offline — no API keys or internet required.
**Features**
- Works fully offline using Ollama
- Built with Streamlit for an interactive chat UI
- Uses LangChain for prompt management
- Lightweight and beginner-friendly
**Tech Stack**
- LangChain
- Streamlit
- Ollama (Llama 3.2:1B Model)
- Python 3.10+
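
These pieces fit together in only a few lines. Below is a minimal sketch of what the Streamlit + LangChain + Ollama app might look like; the file name (`app.py`), the system prompt, and the widget labels are illustrative assumptions, not taken from this repo:

```python
import streamlit as st
from langchain_core.prompts import ChatPromptTemplate
from langchain_ollama import ChatOllama

# Local Llama 3.2 1B model served by Ollama -- no API key or internet needed.
llm = ChatOllama(model="llama3.2:1b")

# LangChain prompt template that carries the user's question to the model.
prompt = ChatPromptTemplate.from_messages(
    [
        ("system", "You are a helpful assistant. Answer concisely."),
        ("user", "{question}"),
    ]
)
chain = prompt | llm

st.title("Offline Llama Chatbot")

# Keep the conversation in Streamlit's session state so it survives reruns.
if "messages" not in st.session_state:
    st.session_state.messages = []

for role, text in st.session_state.messages:
    st.chat_message(role).write(text)

if question := st.chat_input("Ask me anything..."):
    st.chat_message("user").write(question)
    st.session_state.messages.append(("user", question))

    # Run the prompt through the local model and show the reply.
    answer = chain.invoke({"question": question}).content
    st.chat_message("assistant").write(answer)
    st.session_state.messages.append(("assistant", answer))
```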
**Setup Instructions**
1. Install Ollama: download and install it from https://ollama.com/download
2. Pull the Llama model:

```bash
ollama pull llama3.2:1b
```
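
Once the model is pulled, you can sanity-check the local setup before launching the UI. This short snippet (the question text is just an example) calls the model through LangChain's Ollama integration and works fully offline:

```python
from langchain_ollama import ChatOllama

# Talks to the locally running Ollama server and the pulled llama3.2:1b model.
llm = ChatOllama(model="llama3.2:1b")
print(llm.invoke("Say hello in one short sentence.").content)
```

If that prints a response, Ollama is serving the model and the chat UI can be started with `streamlit run app.py` (assuming the app file is named `app.py`).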