
API Reference

This page documents the API endpoints provided by the LangGraph Deployment Kit service.

Base URL

All API URLs are relative to your service host, which defaults to:

http://0.0.0.0:8080

You can configure the host and port in your .env file.

Authentication

If AUTH_SECRET is set in your .env file, all API endpoints will require authentication using a Bearer token:

Authorization: Bearer your-auth-secret
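For example, a Python client might attach the token to every request with a requests.Session. This is a minimal sketch, assuming the service runs at the default base URL and that AUTH_SECRET is read from your own environment:

import os
import requests

BASE_URL = "http://0.0.0.0:8080"  # adjust to your configured host/port

session = requests.Session()
auth_secret = os.getenv("AUTH_SECRET")
if auth_secret:
    # Only needed when AUTH_SECRET is set on the service
    session.headers.update({"Authorization": f"Bearer {auth_secret}"})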

Endpoints

Health Check

GET /health

Returns the service status. No authentication required, even if AUTH_SECRET is set.

Response:

{
  "status": "ok"
}
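A liveness probe can be as simple as the following sketch (no Authorization header required, even when AUTH_SECRET is set):

import requests

resp = requests.get("http://0.0.0.0:8080/health")
resp.raise_for_status()
assert resp.json() == {"status": "ok"}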

Service Info

GET /info

Returns metadata about the available agents.

Response:

{
  "agents": [
    {
      "key": "research-assistant",
      "description": "A research assistant with web search and calculator."
    }
  ],
  "default_agent": "research-assistant"
}
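A client can use this endpoint to discover which agent keys it may pass to the invoke and stream endpoints. A sketch, assuming the default base URL (include the Bearer header only if AUTH_SECRET is set):

import requests

headers = {"Authorization": "Bearer your-auth-secret"}  # omit if AUTH_SECRET is unset
info = requests.get("http://0.0.0.0:8080/info", headers=headers).json()

print("default agent:", info["default_agent"])
print("available agents:", [a["key"] for a in info["agents"]])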

Invoke Agent

POST /invoke

or

POST /{agent_id}/invoke

Invokes an agent with user input and returns a single, complete response.

Request body:

{
  "message": "What is the weather in Tokyo?",
  "model": "gpt-4",
  "thread_id": "optional-thread-id",
  "user_id": "optional-user-id",
  "agent_config": {
    "temperature": 0.7
  }
}

Response:

{
  "type": "ai",
  "content": "The weather in Tokyo is currently 70°F and sunny.",
  "run_id": "847c6285-8fc9-4560-a83f-4e6285809254"
}
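A minimal Python call might look like the following sketch; the agent key, model name, thread_id, and agent_config values are placeholders, not requirements:

import requests

payload = {
    "message": "What is the weather in Tokyo?",
    "model": "gpt-4",
    "thread_id": "optional-thread-id",  # reuse the same id to continue a conversation
    "agent_config": {"temperature": 0.7},
}

resp = requests.post(
    "http://0.0.0.0:8080/research-assistant/invoke",
    json=payload,
    headers={"Authorization": "Bearer your-auth-secret"},  # only if AUTH_SECRET is set
)
resp.raise_for_status()
answer = resp.json()
print(answer["content"], "| run_id:", answer["run_id"])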

Stream Response

POST /stream

or

POST /{agent_id}/stream

Streams an agent's response as a Server-Sent Events (SSE) stream.

Request body:

{
  "message": "What is the weather in Tokyo?",
  "model": "gpt-4",
  "thread_id": "optional-thread-id",
  "user_id": "optional-user-id",
  "stream_tokens": true,
  "stream_node_updates": true,
  "agent_config": {
    "temperature": 0.7
  }
}

Response: A stream of SSE events with the following types:

  • token: Individual tokens from the LLM
  • message: Complete messages from agents
  • node_update: Updates from agent nodes
  • error: Error messages

Example stream:

data: {"type": "token", "content": "The"}
data: {"type": "token", "content": " weather"}
data: {"type": "token", "content": " in"}
data: {"type": "token", "content": " Tokyo"}
data: {"type": "node_update", "content": {"node": "chatbot", "updates": {...}}}
data: {"type": "message", "content": {"type": "ai", "content": "The weather in Tokyo is currently 70°F and sunny."}}
data: [DONE]
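A client can consume this stream with plain HTTP and line-by-line parsing. The sketch below uses requests with stream=True and assumes the event payloads shown above:

import json
import requests

payload = {
    "message": "What is the weather in Tokyo?",
    "stream_tokens": True,
}

with requests.post(
    "http://0.0.0.0:8080/stream",
    json=payload,
    headers={"Authorization": "Bearer your-auth-secret"},  # only if AUTH_SECRET is set
    stream=True,
) as resp:
    resp.raise_for_status()
    for line in resp.iter_lines(decode_unicode=True):
        if not line or not line.startswith("data: "):
            continue  # skip blank separator lines
        data = line[len("data: "):]
        if data == "[DONE]":
            break
        event = json.loads(data)
        if event["type"] == "token":
            print(event["content"], end="", flush=True)
        elif event["type"] == "error":
            raise RuntimeError(event["content"])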

Submit Feedback

POST /feedback

Records feedback for a previous interaction using Langfuse.

Request body:

{
  "run_id": "847c6285-8fc9-4560-a83f-4e6285809254",
  "key": "human-feedback-stars",
  "score": 0.8,
  "kwargs": {
    "comment": "Very helpful response!"
  }
}

Response:

{
  "status": "success"
}
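For example, the following sketch submits a star rating; the run_id must come from a previous /invoke or /stream response, and the key and score semantics are defined by your feedback setup:

import requests

feedback = {
    "run_id": "847c6285-8fc9-4560-a83f-4e6285809254",  # from a previous response
    "key": "human-feedback-stars",
    "score": 0.8,
    "kwargs": {"comment": "Very helpful response!"},
}

resp = requests.post(
    "http://0.0.0.0:8080/feedback",
    json=feedback,
    headers={"Authorization": "Bearer your-auth-secret"},  # only if AUTH_SECRET is set
)
resp.raise_for_status()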

Get Chat History

POST /history

Retrieves the conversation history for a thread.

Request body:

{
  "thread_id": "847c6285-8fc9-4560-a83f-4e6285809254"
}

Response:

{
  "messages": [
    {
      "type": "human",
      "content": "What is the weather in Tokyo?"
    },
    {
      "type": "ai",
      "content": "The weather in Tokyo is currently 70°F and sunny."
    }
  ]
}
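As a sketch, replaying a conversation by thread_id might look like:

import requests

resp = requests.post(
    "http://0.0.0.0:8080/history",
    json={"thread_id": "847c6285-8fc9-4560-a83f-4e6285809254"},
    headers={"Authorization": "Bearer your-auth-secret"},  # only if AUTH_SECRET is set
)
resp.raise_for_status()
for message in resp.json()["messages"]:
    print(f'{message["type"]}: {message["content"]}')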