A full toolkit for building AI agent services with LangGraph, FastAPI and Streamlit.
It includes a LangGraph-based agent, a FastAPI service to serve it, a client to interact with the service, and a Streamlit app that uses the client to provide a chat interface. Data structures and settings are built with Pydantic.
This project offers a template for you to easily build and run your own agents using the LangGraph framework. It demonstrates a complete setup from agent definition to user interface, making it easier to get started with LangGraph-based projects by providing a full, robust toolkit.
Run directly in Python
```sh
# At least one LLM API key is required
echo 'OPENAI_API_KEY=your_openai_api_key' >> .env

# uv is recommended, but "pip install ." also works
pip install uv
uv sync --frozen
# "uv sync" creates .venv automatically
source .venv/bin/activate
python src/run_service.py

# In another terminal
source .venv/bin/activate
streamlit run src/streamlit_app.py
```

Run with Docker
```sh
echo 'OPENAI_API_KEY=your_openai_api_key' >> .env
docker compose watch
```

Key features:

- LangGraph agent with the latest features: A customizable agent built with the LangGraph framework. Implements the latest LangGraph v0.3 features, including human-in-the-loop with `interrupt()`, flow control with `Command`, and `langgraph-supervisor` (a minimal sketch of the first two follows this list).
- FastAPI service: Serves the agent with both streaming and non-streaming endpoints.
- Advanced streaming: A novel approach supporting both token-based and message-based streaming.
- Streamlit interface: Provides a user-friendly chat interface for interacting with the agent.
- Multiple agent support: Run multiple agents in the service and call them via URL paths. Available agents and models are described at `/info`.
- Asynchronous design: Uses async/await to handle concurrent requests efficiently.
- Content moderation: Implements LlamaGuard for content moderation (requires a Groq API key).
- Feedback mechanism: Includes a star-based feedback system integrated with LangSmith.
- Docker support: Includes Dockerfiles and a docker compose file for easy development and deployment.
- Testing: Includes robust unit and integration tests covering the full codebase.
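To make the human-in-the-loop feature concrete, here is a minimal, self-contained sketch of `interrupt()` and `Command` in LangGraph v0.3. It is illustrative only, not code from this repo; the toolkit's agents wire these primitives into the full service.

```python
# Minimal human-in-the-loop sketch using LangGraph's interrupt() and Command.
# Illustrative only -- not code from this repo.
from typing import TypedDict

from langgraph.checkpoint.memory import MemorySaver
from langgraph.graph import END, START, StateGraph
from langgraph.types import Command, interrupt


class State(TypedDict):
    draft: str


def review(state: State):
    # Pause the graph and surface the draft to a human. On resume,
    # interrupt() returns the value passed via Command(resume=...).
    feedback = interrupt({"draft": state["draft"]})
    return {"draft": feedback}


builder = StateGraph(State)
builder.add_node("review", review)
builder.add_edge(START, "review")
builder.add_edge("review", END)
# A checkpointer is required so the paused run can be resumed later.
graph = builder.compile(checkpointer=MemorySaver())

config = {"configurable": {"thread_id": "demo"}}
graph.invoke({"draft": "hello"}, config)                 # pauses at interrupt()
graph.invoke(Command(resume="hello, reviewed"), config)  # resumes with human input
```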
The repository structure is as follows:

- `src/agents/`: Defines several agents with different capabilities
- `src/schema/`: Defines the protocol schema
- `src/core/`: Core modules, including LLM definitions and settings
- `src/service/service.py`: FastAPI service to serve the agents
- `src/client/client.py`: Client to interact with the agent service
- `src/streamlit_app.py`: Streamlit app providing a chat interface
- `tests/`: Unit and integration tests
- Clone the repository:

  ```sh
  git clone https://github.com/JoshuaC215/agent-service-toolkit.git
  cd agent-service-toolkit
  ```

- Set up environment variables: Create a `.env` file in the root directory. At least one LLM API key or configuration is required. See the `.env.example` file for a full list of available environment variables, including a variety of model provider API keys, header-based authentication, LangSmith tracing, testing and development modes, and an OpenWeatherMap API key. (A hypothetical example follows these steps.)

- You can now run the agent service and the Streamlit app locally, either with Docker or just using Python. The Docker setup is recommended for simpler environment setup and immediate reloading of the services when you make changes to your code.
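For example, a minimal `.env` could look like the following. Only one provider key is required; `OLLAMA_MODEL` is one of the optional variables covered later in this README, shown commented out here:

```sh
# Minimal example .env -- values are placeholders
OPENAI_API_KEY=your_openai_api_key
# Optional: use a local Ollama model instead (see the Ollama section below)
# OLLAMA_MODEL=llama3.2
```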
To customize the agent for your own use case:

- Add your new agent to the `src/agents` directory. You can copy `research_assistant.py` or `chatbot.py` and modify it to change the agent's behavior and tools (a hypothetical sketch follows this list).
- Import and add your new agent to the `agents` dictionary in `src/agents/agents.py`. Your agent can be called by `/<your_agent_name>/invoke` or `/<your_agent_name>/stream`.
- Adjust the Streamlit interface in `src/streamlit_app.py` to match your agent's capabilities.
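As a rough sketch of what a new agent module might contain (the file name and model choice are hypothetical, and the repo's own model helpers in `src/core` may differ), a minimal chat agent graph could look like this:

```python
# Hypothetical src/agents/my_agent.py -- a minimal chat agent.
# Model loading uses LangChain's generic init_chat_model here; the repo's
# own model setup in src/core may differ.
from langchain.chat_models import init_chat_model
from langgraph.graph import END, START, MessagesState, StateGraph

model = init_chat_model("gpt-4o-mini", model_provider="openai")


async def chat(state: MessagesState):
    # Call the model on the running message history and append its reply.
    response = await model.ainvoke(state["messages"])
    return {"messages": [response]}


builder = StateGraph(MessagesState)
builder.add_node("chat", chat)
builder.add_edge(START, "chat")
builder.add_edge("chat", END)
my_agent = builder.compile()
```

After importing `my_agent` into `src/agents/agents.py` and adding it to the `agents` dictionary, it would be reachable at `/my_agent/invoke` and `/my_agent/stream`.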
This project includes a Docker setup for easy development and deployment. The `compose.yaml` file defines two services: `agent_service` and `streamlit_app`. The Dockerfile for each is in their respective directories.
For local development, we recommend using `docker compose watch`. This feature allows for a smoother development experience by automatically updating your containers when changes are detected in your source code.
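For reference, `docker compose watch` is driven by a `develop.watch` section on each service. The snippet below is a generic sketch of such a section, not the exact contents of this repo's `compose.yaml`:

```yaml
# Generic sketch of a compose watch config -- not this repo's actual compose.yaml
services:
  agent_service:
    build: .
    develop:
      watch:
        - action: sync+restart   # copy changed source into the container and restart
          path: src/
          target: /app/src/
        - action: rebuild        # dependency changes need a full image rebuild
          path: pyproject.toml
```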
- Make sure you have Docker and Docker Compose (>= 2.23.0) installed on your system.

- Build and launch the services in watch mode:

  ```sh
  docker compose watch
  ```

- The services will now automatically update when you make changes to your code:
  - Changes in the relevant Python files and directories will trigger updates for the relevant services.
  - NOTE: If you make changes to the `pyproject.toml` or `uv.lock` files, you will need to rebuild the services by running `docker compose up --build`.

- Access the Streamlit app by navigating to `http://localhost:8501` in your web browser.

- The agent service API will be available at `http://0.0.0.0:8080`. You can also use the OpenAPI docs at `http://0.0.0.0:8080/redoc`. (A quick curl check follows after this list.)

- Use `docker compose down` to stop the services.
This setup allows you to develop and test your changes in real-time without manually restarting the services.
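Once the services are up, you can sanity-check the API from the command line, for example by querying the `/info` endpoint mentioned earlier (the response lists your configured agents and models):

```sh
# List available agents and models (service address as given above)
curl http://0.0.0.0:8080/info
```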
The repo includes a generic `src/client/client.AgentClient` that can be used to interact with the agent service. This client is designed to be flexible and can be used to build other apps on top of the agent. It supports both synchronous and asynchronous invocations, and streaming and non-streaming requests.
See the `src/run_client.py` file for full examples of how to use the `AgentClient`. A quick example:
```python
from client import AgentClient

client = AgentClient()

response = client.invoke("Tell me a brief joke?")
response.pretty_print()
# ================================== Ai Message ==================================
#
# A man walked into a library and asked the librarian, "Do you have any books on Pavlov's dogs and Schrödinger's cat?"
# The librarian replied, "It rings a bell, but I'm not sure if it's here or not."
```
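Streaming is also supported, as noted above. The following is a hedged sketch: the `stream()` method name and the types it yields are assumed from that description, so check `src/client/client.py` and `src/run_client.py` for the exact interface:

```python
from client import AgentClient

client = AgentClient()
# Assumed interface: stream() yields token strings and/or complete messages,
# matching the token- and message-based streaming modes described above.
for item in client.stream("Tell me a brief joke?"):
    if isinstance(item, str):
        print(item, end="", flush=True)  # token-based streaming
    else:
        item.pretty_print()              # message-based streaming
```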
The agent supports LangGraph Studio, a new IDE for developing agents in LangGraph. You can simply install LangGraph Studio, add your `.env` file to the root directory as described above, and then launch LangGraph Studio pointed at the root directory. Customize `langgraph.json` as needed.
You can also use Ollama to run the LLM powering the agent service.

- Install Ollama using the instructions at https://github.com/ollama/ollama
- Install any model you want to use, e.g. `ollama pull llama3.2`, and set the `OLLAMA_MODEL` environment variable to the model you want to use, e.g. `OLLAMA_MODEL=llama3.2`

If you are running the service locally (e.g. `python src/run_service.py`), you should be all set!
If you are running the service in Docker, you will also need to:

- Configure the Ollama server as described here, e.g. by running `launchctl setenv OLLAMA_HOST "0.0.0.0"` on macOS, and restart Ollama.
- Set the `OLLAMA_BASE_URL` environment variable to the base URL of the Ollama server, e.g. `OLLAMA_BASE_URL=http://host.docker.internal:11434`
- Alternatively, you can run the `ollama/ollama` image in Docker and use a similar configuration (though it may be slower in some cases).
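Putting the local (non-Docker) path together, the steps above amount to:

```sh
# Pull a model, point the service at it, and start the service
ollama pull llama3.2
echo 'OLLAMA_MODEL=llama3.2' >> .env
python src/run_service.py
```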
You can also run the agent service and the Streamlit app locally without Docker, just using a Python virtual environment.

- Create a virtual environment and install dependencies:

  ```sh
  pip install uv
  uv sync --frozen
  source .venv/bin/activate
  ```

- Run the FastAPI server:

  ```sh
  python src/run_service.py
  ```

- In a separate terminal, run the Streamlit app:

  ```sh
  streamlit run src/streamlit_app.py
  ```

- Open your browser and navigate to the URL provided by Streamlit (usually `http://localhost:8501`).
The following are a few of the public projects that drew code or inspiration from this repo.
- alexrisch/agent-web-kit: A Next.js frontend for agent-service-toolkit
- raushan-in/dapa - Digital Arrest Protection App (DAPA) enables users to report financial scams and frauds efficiently via a user-friendly platform.
Please create a pull request editing the README, or open a discussion, for any new ones to be added! We would love to include more projects.
Contributions are welcome! Please feel free to submit a Pull Request. Currently, the tests need to be run using the local development setup (without Docker). To run the tests for the agent service:

- Ensure you're in the project root directory and have activated your virtual environment.

- Install the development dependencies and pre-commit hooks:

  ```sh
  pip install uv
  uv sync --frozen
  pre-commit install
  ```

- Run the tests using pytest:

  ```sh
  pytest
  ```
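While iterating, pytest's standard selection flags also work, e.g. to run only a keyword-matched subset of the `tests/` directory (the keyword here is just an example):

```sh
pytest tests -k "client"
```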
This project is licensed under the MIT License - see the LICENSE file for details.

