Polymarket Parser

This project is a data ingestion service that parses trade events and crawls market data from Polymarket. It uses BullMQ for job processing, PostgreSQL for data persistence, and Docker for containerization.

Features

  • Trade Parser: Fetches and parses trade events from the Polymarket Subgraph.
  • Market Crawler: Crawls detailed market information from the Polymarket Gamma API.
  • Reliable Queues: Uses Redis-backed BullMQ queues (trade_parsing_queue, market_crawling_queue) for scalable, resilient job processing (see the sketch below).
  • State Management: Tracks processing progress using database checkpoints to ensure no data is missed or duplicated.
  • Dockerized: Fully containerized setup for easy deployment and development.
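
As a rough illustration of how those queues fit together, the sketch below declares both queues with BullMQ and schedules a repeatable trade-parsing job. The queue names and TRADE_JOB_INTERVAL_MS come from this README; the Redis settings, job names, and fallback intervals are assumptions, not the project's actual code.

import { Queue } from 'bullmq';

// Redis connection settings (illustrative defaults; docker-compose exposes Redis on 6379).
const connection = { host: 'localhost', port: 6379 };

// The two queues named in this README.
const tradeQueue = new Queue('trade_parsing_queue', { connection });
const marketQueue = new Queue('market_crawling_queue', { connection });

// Repeatable jobs: re-enqueue parsing/crawling on a fixed interval.
// The job names and the 60s market interval are hypothetical.
await tradeQueue.add('parse-trades', {}, {
  repeat: { every: Number(process.env.TRADE_JOB_INTERVAL_MS ?? 10_000) },
});
await marketQueue.add('crawl-markets', {}, { repeat: { every: 60_000 } });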

Architecture

The system consists of two main background workers:

  1. Trade Worker:

    • Periodically checks for new trade events on the Subgraph.
    • Parses raw event data into structured Trade records.
    • Updates a checkpoint to track the last processed block (see the sketch after this list).
  2. Market Worker:

    • Runs periodically to enrich trade data with market details.
    • Fetches market metadata (such as the question, outcomes, and closed status) from the Gamma API.
    • Updates the Market table and links trades to markets.
    • Includes logic to ensure it doesn't process blocks ahead of the Trade Parser.
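
Both workers follow the same checkpoint pattern. A minimal sketch for the Trade Worker is below; the checkpoints column names, the DB_* variable names, and the two helper functions are placeholders, not the project's actual schema or API.

import { Worker } from 'bullmq';
import { Pool } from 'pg';

// Hypothetical helpers: fetch events from the Subgraph and persist them,
// returning the highest block seen.
declare function fetchTradesFromSubgraph(fromBlock: number): Promise<unknown[]>;
declare function saveTrades(db: Pool, events: unknown[]): Promise<number>;

// DB_* variable names are assumed from the environment section below.
const db = new Pool({
  host: process.env.DB_HOST,
  port: Number(process.env.DB_PORT),
  user: process.env.DB_USER,
  password: process.env.DB_PASSWORD,
  database: process.env.DB_NAME,
});

new Worker('trade_parsing_queue', async () => {
  // 1. Load the last processed block from the checkpoints table.
  const { rows } = await db.query(
    "SELECT last_block FROM checkpoints WHERE job = 'trade_parser'",
  );
  const fromBlock = rows[0]?.last_block ?? Number(process.env.START_BLOCK_TRADE_PARSER);

  // 2. Fetch and store trade events since that block.
  const events = await fetchTradesFromSubgraph(fromBlock);
  const lastBlock = await saveTrades(db, events);

  // 3. Advance the checkpoint so no block is missed or duplicated.
  await db.query(
    "UPDATE checkpoints SET last_block = $1 WHERE job = 'trade_parser'",
    [lastBlock],
  );
}, { connection: { host: 'localhost', port: 6379 } });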

Prerequisites

  • Docker and Docker Compose
  • Node.js (v20+) - for local development

Getting Started

1. Clone the repository

git clone https://github.com/tentou-tech/polymarket-parser.git
cd polymarket-parser

2. Configure Environment Variables

Copy the example environment file and update it with your configuration:

cp .env.example .env

Key variables to check:

  • SUBGRAPH_URL: URL of the Polymarket Subgraph (or local graph-node).
  • DB_*: Database credentials (the defaults match docker-compose.yml).
  • START_BLOCK_TRADE_PARSER: The block number to start parsing from.
  • TRADE_JOB_INTERVAL_MS: Interval, in milliseconds, between repeatable trade-parsing jobs.
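
Put together, a filled-in .env might look like the following. Every value here is illustrative (the exact DB_* names and defaults in particular are assumptions); copy the real keys from .env.example.

# Subgraph endpoint (a local graph-node is shown; use your hosted subgraph URL instead)
SUBGRAPH_URL=http://localhost:8000/subgraphs/name/polymarket

# Database credentials (hypothetical names; defaults should line up with docker-compose.yml)
DB_HOST=localhost
DB_PORT=5433
DB_USER=postgres
DB_PASSWORD=postgres
DB_NAME=polymarket

# First block the trade parser reads from
START_BLOCK_TRADE_PARSER=0

# How often repeatable jobs fire, in milliseconds
TRADE_JOB_INTERVAL_MS=10000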

3. Run with Docker Compose

This is the recommended way to run the application, database, and Redis.

docker-compose up -d --build

This command will:

  • Start PostgreSQL on port 5433 (exposed to host).
  • Start Redis on port 6379.
  • Build and start the app container.
  • Automatically run database migrations.
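
In outline, the compose file provides something like the sketch below. Image tags, credentials, and the migration command are assumptions; the port mappings match the list above.

services:
  db:
    image: postgres:16          # image tag is an assumption
    ports:
      - "5433:5432"             # host 5433 -> container 5432, as noted above
    environment:
      POSTGRES_PASSWORD: postgres
  redis:
    image: redis:7              # image tag is an assumption
    ports:
      - "6379:6379"
  app:
    build: .
    env_file: .env
    command: sh -c "npm run migrate && npm start"   # migration-on-boot is an assumption
    depends_on:
      - db
      - redis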

4. Local Development

If you prefer to run the application locally (outside Docker):

  1. Start Dependencies: Use Docker to run only DB and Redis.

    docker-compose up -d db redis
  2. Install Dependencies:

    npm install
  3. Run Migrations:

    npm run migrate
  4. Start the Worker:

    npm run dev
    # OR
    npm run build && npm start

Database Schema

  • trades: Stores individual trade transactions (price, volume, maker, taker, etc.).
  • markets: Stores market metadata (slug, question, outcomes, closed status, etc.).
  • checkpoints: Stores the last processed block number for each job (trade_parser, market_crawler).
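
For orientation, the records roughly correspond to the TypeScript shapes below. Field names are inferred from the descriptions above and are illustrative, not the exact migration schema.

interface Trade {
  id: string;
  marketId: string | null; // linked to a market by the Market Worker
  price: string;           // numeric values kept as strings to avoid float loss
  volume: string;
  maker: string;
  taker: string;
  blockNumber: number;
}

interface Market {
  id: string;
  slug: string;
  question: string;
  outcomes: string[];
  closed: boolean;
}

interface Checkpoint {
  job: 'trade_parser' | 'market_crawler';
  lastBlock: number;       // last processed block for this job
}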

Logging

The application uses pino for structured logging. In development (local run), it uses pino-pretty for readable logs. In production (Docker), it outputs JSON logs suitable for ingestion by log management systems.
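
A common way to get that dev/prod split with pino is shown below; the NODE_ENV check is an assumption about how the project distinguishes the two environments.

import pino from 'pino';

// Pretty-print locally; emit raw JSON in production (e.g. inside Docker).
const logger = pino(
  process.env.NODE_ENV === 'production'
    ? {}                                         // plain JSON to stdout
    : { transport: { target: 'pino-pretty' } },  // human-readable dev logs
);

logger.info({ queue: 'trade_parsing_queue' }, 'worker started');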
