This project is a robust data ingestion service designed to parse trade events and crawl market data from Polymarket. It utilizes BullMQ for job processing, PostgreSQL for data persistence, and Docker for containerization.
- Trade Parser: Fetches and parses trade events from the Polymarket Subgraph.
- Market Crawler: Crawls detailed market information from the Polymarket Gamma API.
- Reliable Queues: Uses Redis-backed BullMQ queues (`trade_parsing_queue`, `market_crawling_queue`) for scalable and resilient job processing (see the sketch after this list).
- State Management: Tracks processing progress using database checkpoints to ensure no data is missed or duplicated.
- Dockerized: Fully containerized setup for easy deployment and development.
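
For illustration, a minimal sketch of how these queues might be declared with BullMQ. The queue names and `TRADE_JOB_INTERVAL_MS` come from this README; the connection options, job name, and fallback interval are assumptions:

```typescript
import { Queue } from 'bullmq';
import IORedis from 'ioredis';

// Sketch only: connection options are assumptions; the queue names
// come from the feature list above.
const connection = new IORedis({ maxRetriesPerRequest: null });

export const tradeQueue = new Queue('trade_parsing_queue', { connection });
export const marketQueue = new Queue('market_crawling_queue', { connection });

// Enqueue a repeatable job driven by TRADE_JOB_INTERVAL_MS.
await tradeQueue.add(
  'parse-trades',
  {},
  { repeat: { every: Number(process.env.TRADE_JOB_INTERVAL_MS ?? 60_000) } },
);
```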
The system consists of two main background workers:
- Trade Worker:
  - Periodically checks for new trade events on the Subgraph.
  - Parses raw event data into structured `Trade` records.
  - Updates a checkpoint to track the last processed block.
- Market Worker:
  - Runs periodically to enrich trade data with market details.
  - Fetches market metadata (such as questions, outcomes, and closed status) from the Gamma API.
  - Updates the `Market` table and links trades to markets.
  - Includes logic to ensure it doesn't process blocks ahead of the Trade Parser (see the sketch after this list).
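
The block-gating behaviour of the Market Worker could look roughly like the following sketch; `getCheckpoint` and `crawlMarketsUpTo` are hypothetical stand-ins for the real database query and Gamma API calls:

```typescript
import { Worker } from 'bullmq';
import IORedis from 'ioredis';

const connection = new IORedis({ maxRetriesPerRequest: null });

// Stubs for illustration only; the real versions read the checkpoints
// table and call the Gamma API.
async function getCheckpoint(job: string): Promise<number> {
  return 0;
}
async function crawlMarketsUpTo(block: number): Promise<void> {}

// The market worker never advances past the trade parser's checkpoint,
// so markets are only enriched for blocks whose trades already exist.
new Worker(
  'market_crawling_queue',
  async () => {
    const tradeHead = await getCheckpoint('trade_parser');
    const marketHead = await getCheckpoint('market_crawler');
    if (marketHead >= tradeHead) return; // nothing safe to process yet
    await crawlMarketsUpTo(tradeHead);
  },
  { connection },
);
```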
- Docker and Docker Compose
- Node.js (v20+) for local development
```bash
git clone https://github.com/tentou-tech/polymarket-parser.git
cd polymarket-parser
```

Copy the example environment file and update it with your configuration:
```bash
cp .env.example .env
```

Key variables to check:
- `SUBGRAPH_URL`: URL of the Polymarket Subgraph (or a local graph-node).
- `DB_*`: Database credentials (defaults match `docker-compose.yml`).
- `START_BLOCK_TRADE_PARSER`: The block number to start parsing from.
- `TRADE_JOB_INTERVAL_MS`: Interval for repeatable jobs, in milliseconds.
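
As a rough illustration, startup code might load and validate these variables like this (only the variable names come from this README; the module shape and defaults are assumptions):

```typescript
// Hypothetical config loader; fails fast if a required variable is unset.
const required = (name: string): string => {
  const value = process.env[name];
  if (!value) throw new Error(`Missing required env var: ${name}`);
  return value;
};

export const config = {
  subgraphUrl: required('SUBGRAPH_URL'),
  startBlock: Number(required('START_BLOCK_TRADE_PARSER')),
  tradeJobIntervalMs: Number(process.env.TRADE_JOB_INTERVAL_MS ?? 60_000),
};
```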
This is the recommended way to run the application, database, and Redis.
```bash
docker-compose up -d --build
```

This command will:
- Start PostgreSQL on port `5433` (exposed to the host).
- Start Redis on port `6379`.
- Build and start the `app` container.
- Automatically run database migrations.
If you prefer to run the application locally (outside Docker):
- Start Dependencies: Use Docker to run only the DB and Redis.

  ```bash
  docker-compose up -d db redis
  ```

- Install Dependencies:

  ```bash
  npm install
  ```

- Run Migrations:

  ```bash
  npm run migrate
  ```

- Start the Worker:

  ```bash
  npm run dev
  # OR
  npm run build && npm start
  ```
- `trades`: Stores individual trade transactions (price, volume, maker, taker, etc.).
- `markets`: Stores market metadata (slug, question, outcomes, closed status, etc.).
- `checkpoints`: Stores the last processed block number for each job (`trade_parser`, `market_crawler`).
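
As a sketch of how the `checkpoints` table might be written to, assuming `job_name` and `last_block` columns (the actual migration may differ):

```typescript
import { Pool } from 'pg';

// Assumed wiring: in practice the pool would be configured from the
// DB_* env vars; new Pool() with no args reads the standard PG* vars.
const pool = new Pool();

// Hypothetical helper: advance a job's checkpoint without ever moving
// it backwards.
export async function saveCheckpoint(job: string, block: number): Promise<void> {
  await pool.query(
    `INSERT INTO checkpoints (job_name, last_block)
       VALUES ($1, $2)
     ON CONFLICT (job_name)
       DO UPDATE SET last_block = GREATEST(checkpoints.last_block, EXCLUDED.last_block)`,
    [job, block],
  );
}
```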
The application uses `pino` for structured logging. In development (local run), it uses `pino-pretty` for readable logs. In production (Docker), it outputs JSON logs suitable for ingestion by log management systems.
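
A minimal sketch of such a logger setup; gating on `NODE_ENV` is an assumption about how the app distinguishes the two modes:

```typescript
import pino from 'pino';

// Pretty-print in development, raw JSON lines in production.
export const logger = pino(
  process.env.NODE_ENV === 'production'
    ? {} // plain JSON output, ready for log aggregation
    : { transport: { target: 'pino-pretty' } },
);

logger.info({ queue: 'trade_parsing_queue' }, 'worker started');
```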