Live demo: https://tijo-langchat.streamlit.app/
This project is a chatbot application built using FastAPI, Streamlit, and LangGraph. The chatbot interacts with users through a web interface and processes their messages using a LangGraph-based agent.
```
Agentic_Langraph_chatbot/
├── app.py            # Main file with FastAPI backend
├── Dockerfile        # Docker image definition
├── README.md         # Project documentation
├── requirements.txt  # Dependencies
├── ui.py             # Streamlit file for the front end
└── .env              # Environment variable template (add your own API keys here)
```

- FastAPI: A modern, fast (high-performance) web framework for building APIs with Python 3.6+ based on standard Python type hints.
- Streamlit: An open-source app framework for Machine Learning and Data Science teams.
- LangGraph: A library for building stateful, multi-actor applications with LLMs, used here to implement the chatbot agent.
- Uvicorn: A lightning-fast ASGI server implementation, using uvloop and httptools.
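For the FastAPI/Streamlit split to work, the front end and back end have to agree on a request schema. The sketch below illustrates one possible shape; the field names (`system_prompt`, `model_name`, `messages`) are assumptions for illustration, not taken from the project's actual code:

```python
from dataclasses import dataclass, field

@dataclass
class ChatRequest:
    """Hypothetical shape of the JSON body ui.py could POST to the backend."""
    system_prompt: str   # defines the agent's behaviour
    model_name: str      # model selected in the dropdown
    messages: list = field(default_factory=list)  # conversation so far

    def to_payload(self) -> dict:
        # Plain dict, ready to be serialised as the JSON request body.
        return {
            "system_prompt": self.system_prompt,
            "model_name": self.model_name,
            "messages": self.messages,
        }

req = ChatRequest("You are a helpful assistant.", "gpt-4o-mini", ["Hello!"])
```

Keeping the schema in one shared structure like this avoids the front end and back end drifting apart as fields are added.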
Follow these steps to set up and run the project:

1. Create a Conda environment:

   ```bash
   conda create -n langchat python=3.11 -y
   ```

2. Activate the Conda environment:

   ```bash
   conda activate langchat
   ```

3. Install the required dependencies:

   ```bash
   pip install -r requirements.txt
   ```

4. Run the FastAPI and Streamlit servers (in separate terminals):

   ```bash
   uvicorn app:app --reload
   streamlit run ui.py
   ```
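Once both servers are running, the Streamlit UI talks to the FastAPI backend over plain HTTP. A minimal sketch of such a call using only the standard library; the `/chat` endpoint path and the payload keys are hypothetical, and the real names in `ui.py` may differ:

```python
import json
import urllib.request

API_URL = "http://127.0.0.1:8000/chat"  # assumed endpoint path

def build_body(system_prompt: str, model: str, message: str) -> bytes:
    """Encode the chat request as a JSON body (field names are hypothetical)."""
    return json.dumps({
        "system_prompt": system_prompt,
        "model_name": model,
        "messages": [message],
    }).encode("utf-8")

def ask_agent(system_prompt: str, model: str, message: str) -> dict:
    """POST the request to the backend and return the parsed JSON response."""
    req = urllib.request.Request(
        API_URL,
        data=build_body(system_prompt, model, message),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```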
Alternatively, you can use Docker to run the project:

1. Build the Docker image:

   ```bash
   docker build -t langraph_chatbot .
   ```

2. Run the Docker container:

   ```bash
   docker run -p 8000:8000 -p 8501:8501 langraph_chatbot
   ```
- Open your web browser and navigate to http://localhost:8501.
- Define your AI agent by entering a system prompt.
- Select a model from the dropdown menu.
- Enter your message(s) and click the "Submit" button.
- View the agent's response on the web interface.
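Behind the last step, a common pattern is for the backend to return the conversation as a list of role-tagged messages, with the UI displaying the latest AI turn. A small sketch under that assumption; the response format shown here is hypothetical, not the project's actual output:

```python
def last_ai_message(history: list) -> str:
    """Return the content of the most recent AI/assistant message, or '' if none."""
    for turn in reversed(history):
        if turn.get("role") in ("ai", "assistant"):
            return turn.get("content", "")
    return ""

# Example conversation in the assumed role/content format:
history = [
    {"role": "user", "content": "Hello!"},
    {"role": "assistant", "content": "Hi, how can I help?"},
]
```

Scanning from the end of the list keeps the lookup correct even when the backend echoes back the full conversation rather than just the newest reply.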
- Ensure that the FastAPI server is running on http://127.0.0.1:8000, as the Streamlit app communicates with this endpoint.
- The Dockerfile is configured to expose ports 8000 and 8501 for FastAPI and Streamlit, respectively.
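Because the Streamlit app depends on the backend being reachable, it can be useful to check that something is listening on port 8000 before launching the UI. A minimal sketch using only the standard library:

```python
import socket

def backend_is_up(host: str = "127.0.0.1", port: int = 8000,
                  timeout: float = 1.0) -> bool:
    """Return True if a server is accepting TCP connections on host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Connection refused or timed out: nothing is listening there.
        return False
```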
This project is licensed under the MIT License.