
🧰 AI Agent Service Toolkit


A full toolkit for running an AI agent service built with LangGraph, FastAPI and Streamlit.

It includes a LangGraph agent, a FastAPI service to serve it, a client to interact with the service, and a Streamlit app that uses the client to provide a chat interface. Data structures and settings are built with Pydantic.

This project offers a template for easily building and running your own agents with the LangGraph framework. It demonstrates the complete setup from agent definition to user interface, making it easier to get started with LangGraph-based projects by providing a full, robust toolkit.

🎥 Watch a video walkthrough of the repo and app

Overview

Quickstart

Run directly in Python

# At least one LLM API key is required
echo 'OPENAI_API_KEY=your_openai_api_key' >> .env

# uv is recommended, but "pip install ." also works
pip install uv
uv sync --frozen
# "uv sync" creates the .venv automatically
source .venv/bin/activate
python src/run_service.py
# In a separate terminal
source .venv/bin/activate
streamlit run src/streamlit_app.py

Run with Docker

echo 'OPENAI_API_KEY=your_openai_api_key' >> .env
docker compose watch

Architecture Diagram

Key Features

  1. LangGraph agent with the latest features: A customizable agent built on the LangGraph framework. Implements the latest LangGraph v0.3 features, including human-in-the-loop with interrupt(), flow control with Command, and langgraph-supervisor.
  2. FastAPI service: Serves the agent with both streaming and non-streaming endpoints.
  3. Advanced streaming: A novel approach to support both token-based and message-based streaming.
  4. Streamlit interface: Provides a user-friendly chat interface for interacting with the agent.
  5. Multiple agent support: Run multiple agents in the service and call them by URL path. Available agents and models are described on /info (see the sketch after this list).
  6. Asynchronous design: Uses async/await to handle concurrent requests efficiently.
  7. Content moderation: Implements LlamaGuard content moderation (requires a Groq API key).
  8. Feedback mechanism: Includes a star-based feedback system integrated with LangSmith.
  9. Docker support: Includes Dockerfiles and a docker compose file for easy development and deployment.
  10. Testing: Includes robust unit and integration tests for the full codebase.
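
As a quick illustration of items 2 and 5, the sketch below queries /info and then invokes an agent by URL path. It assumes the service is running locally on port 8080, that a chatbot agent is registered, and that the invoke endpoint accepts a JSON body with a message field; the OpenAPI docs at /redoc are the authoritative reference for the actual schema.

import httpx

BASE_URL = "http://localhost:8080"

# List the available agents and models (item 5 above)
info = httpx.get(f"{BASE_URL}/info").json()
print(info)

# Invoke a specific agent by URL path; the request body shape is an
# assumption here - check /redoc for the real schema
response = httpx.post(
    f"{BASE_URL}/chatbot/invoke",
    json={"message": "Hello!"},
    timeout=30,
)
print(response.json())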

Key Files

The repository is structured as follows:

  • src/agents/: Defines several agents with different capabilities
  • src/schema/: Defines the protocol schema
  • src/core/: Core modules, including LLM definitions and settings
  • src/service/service.py: FastAPI service to serve the agents
  • src/client/client.py: Client to interact with the agent service
  • src/streamlit_app.py: Streamlit app providing a chat interface
  • tests/: Unit and integration tests

Setup and Usage

  1. 克隆仓库:

    git clone https://github.com/JoshuaC215/agent-service-toolkit.git
    cd agent-service-toolkit
  2. Set up environment variables: Create a .env file in the root directory. At least one LLM API key or configuration is required. See the .env.example file for a full list of available environment variables, including a variety of model provider API keys, header-based authentication, LangSmith tracing, testing and development modes, and an OpenWeatherMap API key.

  3. You can now run the agent service and the Streamlit app locally, either with Docker or just using Python. The Docker setup is recommended for simpler environment setup and immediate reloading of the services when you make changes to your code.

Building or customizing your own agent

To customize the agent for your own use case:

  1. Add your new agent to the src/agents directory. You can copy research_assistant.py or chatbot.py and modify it to change the agent's behavior and tools (a minimal sketch follows this list).
  2. Import and add your new agent to the agents dictionary in src/agents/agents.py. Your agent can be called by /<your_agent_name>/invoke or /<your_agent_name>/stream.
  3. Adjust the Streamlit interface in src/streamlit_app.py to match your agent's capabilities.
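
For orientation, here is a minimal sketch of what a new agent module might look like, using a plain LangGraph chat graph. The file name, model choice, and node layout are all illustrative; mirror an existing agent such as chatbot.py and the agents dictionary in src/agents/agents.py for the real conventions.

# src/agents/my_agent.py - illustrative sketch only
from langchain_openai import ChatOpenAI
from langgraph.graph import END, START, MessagesState, StateGraph

model = ChatOpenAI(model="gpt-4o-mini")

async def chat(state: MessagesState) -> dict:
    # Call the LLM with the accumulated message history and append its reply
    response = await model.ainvoke(state["messages"])
    return {"messages": [response]}

builder = StateGraph(MessagesState)
builder.add_node("chat", chat)
builder.add_edge(START, "chat")
builder.add_edge("chat", END)
my_agent = builder.compile()

Once registered in the agents dictionary, this agent would be reachable at /my_agent/invoke or /my_agent/stream.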

Docker Setup

This project includes a Docker setup for easy development and deployment. The compose.yaml file defines two services: agent_service and streamlit_app. The Dockerfile for each is in their respective directories.

For local development, we recommend using docker compose watch. This feature allows for a smoother development experience by automatically updating your containers when changes are detected in your source code.

  1. Make sure you have Docker and Docker Compose (>=2.23.0) installed on your system.

  2. Build and launch the services in watch mode:

    docker compose watch
  3. The services will now automatically update when you make changes to your code:

    • Changes in the relevant Python files and directories will trigger updates for the relevant services.
    • NOTE: If you make changes to the pyproject.toml or uv.lock files, you will need to rebuild the services by running docker compose up --build.
  4. Access the Streamlit app by navigating to http://localhost:8501 in your web browser.

  5. The agent service API will be available at http://0.0.0.0:8080. You can also use the OpenAPI docs at http://0.0.0.0:8080/redoc.

  6. Use docker compose down to stop the services.

This setup allows you to develop and test your changes in real-time without manually restarting the services.

Building other apps on the AgentClient

The repo includes a generic src/client/client.AgentClient that can be used to interact with the agent service. This client is designed to be flexible and can be used to build other apps on top of the agent. It supports both synchronous and asynchronous invocations, and streaming and non-streaming requests.

See the src/run_client.py file for full examples of how to use the AgentClient. A quick example:

from client import AgentClient
client = AgentClient()

response = client.invoke("Tell me a brief joke?")
response.pretty_print()
# ================================== Ai Message ==================================
#
# A man walked into a library and asked the librarian, "Do you have any books on Pavlov's dogs and Schrödinger's cat?"
# The librarian replied, "It rings a bell, but I'm not sure if it's here or not."
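
For streaming, something along these lines should work, assuming AgentClient exposes an astream method as the async, streaming counterpart of invoke (the event types below are an assumption; src/run_client.py is the authoritative reference):

import asyncio

from client import AgentClient

async def main() -> None:
    client = AgentClient()
    # Stream the response as it is generated; depending on settings,
    # events may be full message objects or plain str tokens (assumed)
    async for event in client.astream("Tell me a brief joke?"):
        if isinstance(event, str):
            print(event, end="", flush=True)
        else:
            print(event)

asyncio.run(main())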

Development with LangGraph Studio

The agent supports LangGraph Studio, a new IDE for developing agents in LangGraph.

You can simply install LangGraph Studio, add your .env file to the root directory as described above, and then launch LangGraph Studio pointed at the root directory. Customize langgraph.json as needed.
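
For reference, a langgraph.json for this layout typically looks something like the sketch below; the graph name and module path are illustrative, so treat the langgraph.json checked into the repo as authoritative.

{
  "dependencies": ["."],
  "graphs": {
    "research_assistant": "./src/agents/research_assistant.py:research_assistant"
  },
  "env": ".env"
}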

Using Ollama

⚠️ Note: Ollama support in agent-service-toolkit is experimental and may not work as expected. The instructions below have been tested using Docker Desktop on a MacBook Pro. Please file an issue for any challenges you encounter.

You can also use Ollama to run the LLM powering the agent service.

  1. Install Ollama using instructions from https://github.com/ollama/ollama
  2. Install any model you want to use, e.g. ollama pull llama3.2, and set the OLLAMA_MODEL environment variable to the model you want to use, e.g. OLLAMA_MODEL=llama3.2

If you are running the service locally (e.g. python src/run_service.py), you should be all set!

If you are running the service in Docker, you will also need to:

  1. Configure the Ollama server as described here, e.g. by running launchctl setenv OLLAMA_HOST "0.0.0.0" on macOS, and restart Ollama.
  2. Set the OLLAMA_BASE_URL environment variable to the base URL of the Ollama server, e.g. OLLAMA_BASE_URL=http://host.docker.internal:11434 (both variables can go in your .env, as shown below).
  3. Alternatively, you can run the ollama/ollama image in Docker and use a similar configuration (though it may be slower in some cases).
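
Following the quickstart pattern, both variables can be appended to the same .env file:

echo 'OLLAMA_MODEL=llama3.2' >> .env
echo 'OLLAMA_BASE_URL=http://host.docker.internal:11434' >> .env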

Local development without Docker

You can also run the agent service and the Streamlit app locally without Docker, just using a Python virtual environment.

  1. Create a virtual environment and install dependencies:

    pip install uv
    uv sync --frozen
    source .venv/bin/activate
  2. Run the FastAPI server:

    python src/run_service.py
  3. In a separate terminal, run the Streamlit app:

    streamlit run src/streamlit_app.py
  4. Open your browser and navigate to the URL provided by Streamlit (usually http://localhost:8501).

Projects built with or inspired by agent-service-toolkit

The following are a few of the public projects that drew code or inspiration from this repo.

  • alexrisch/agent-web-kit - A Next.JS frontend for agent-service-toolkit
  • raushan-in/dapa - Digital Arrest Protection App (DAPA) enables users to report financial scams and frauds efficiently via a user-friendly platform.

Please create a pull request editing the README or open a discussion with any new ones to be added! We'd love to include more projects.

Contributing

Contributions are welcome! Please feel free to submit a Pull Request. Currently, the tests must be run using the local development setup (without Docker). To run the tests for the agent service:

  1. Ensure you're in the project root directory and have activated your virtual environment.

  2. Install the development dependencies and pre-commit hooks:

    pip install uv
    uv sync --frozen
    pre-commit install
  3. Run the tests using pytest:

    pytest

License

This project is licensed under the MIT License - see the LICENSE file for details.
