
Conversation


@Siddharth2207 (Collaborator) commented on Sep 26, 2025

Summary

Stox price pusher

Summary by CodeRabbit

  • New Features

    • Replaced the BTC/BNB/PYTH public feeds with a broad, uniform catalog of equity/USD price feeds (including pre- and post-market variants).
  • Refactor

    • Startup now awaits an initial price update (new wait-for-first-update API); price freshness threshold relaxed to 24 hours. Added runtime diagnostics and explicit gas-limit handling for transactions.
  • Chores

    • Added platform-aware developer environment, Docker image support, and CI workflow.
  • Tests

    • Added new JSON test payloads.



coderabbitai bot commented Sep 26, 2025

Walkthrough

Replaces crypto price entries with many equity configs; awaits an initial Pyth price update at startup (exposes wait API); adds EVM gas-limit selection and extra gas logging; makes devShell platform-aware; adds test data, Dockerfile, and Docker CI workflow.

Changes

  • Price config updates (apps/price_pusher/price-config.stable.sample.yaml): Removed BTC/USD, BNB/USD, PYTH/USD. Added many public equity entries (GOOG/USD, AMZN/USD, AMZN/USD.PRE, AMZN/USD.POST, AAPL/USD, MSFT/USD, TSLA/USD, TSLA/USD.PRE, TSLA/USD.POST, NVDA/USD, NVDA/USD.PRE, NVDA/USD.POST, META/USD, GME/USD, MSTR/USD, MSTR/USD.PRE, MSTR/USD.POST, BRKB/USD, SPLG/USD, IAU/USD, COIN/USD, COIN/USD.PRE, COIN/USD.POST, SIVR/USD) using uniform fields (alias, id, time_difference, price_deviation, confidence_ratio, early_update); see the sketch after this list.
  • Startup sequencing / CLI (apps/price_pusher/src/evm/command.ts): Starts the Pyth listener earlier, awaits up to ~15000ms for the first price update, and logs whether the update arrived or timed out before creating the wallet client/contract.
  • Pyth listener behavior & API (apps/price_pusher/src/pyth-price-listener.ts): start() now awaits startListening(); added waitForFirstPriceUpdate(timeoutMs?: number): Promise<boolean>; freshness threshold changed from 60s to 24h; added diagnostics/logs and enhanced reconnect/error logging; the previously-thrown stale-feed path is commented out.
  • EVM gas handling & logging (apps/price_pusher/src/evm/evm.ts): Added debug logs for the initial/final gas price, nonce, and update fee; introduced gasLimitToUse (the rounded provided gasLimit or a default of 1_000_000) and consistently passes gas: gasLimitToUse for both simulation and submission.
  • Dev shell platform deps (flake.nix): Reworked devShells.default.buildInputs into a with pkgs; base list plus platform-specific additions (Linux vs Darwin); replaced python3Packages.distutils with python3Packages.setuptools.
  • Dockerfile (Dockerfile): New Node 22.14-based Dockerfile with build args GIT_SHA/DOCKER_CHANNEL; sets workdir /price-pusher, copies the project, and the CMD runs npm start evm with runtime flags (price-config, endpoint, pyth-contract-address, price-service-endpoint, mnemonic-file via process substitution, pushing/polling frequencies, override gas multiplier).
  • Docker CI workflow (.github/workflows/rainix-docker.yml): New GitHub Actions workflow: checkout (with submodules), install Nix tooling, run pnpm via nix develop, build price-pusher with turbo, configure Docker Buildx/login, and build and push the image with GIT_SHA and DOCKER_CHANNEL; triggers on push and manual dispatch.
  • Test data (test.json): Added test JSON containing two documents with binary blobs and parsed entries (id, price, ema_price, metadata); data-only.
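
A rough TypeScript sketch of the uniform entry shape implied by those config fields. The names and semantics are inferred from the summary above rather than taken from the repository's actual schema, so treat it as illustrative only.

```typescript
// Illustrative shape of one price-config entry, inferred from the uniform fields above.
// This is NOT the project's canonical schema; field semantics are best-effort guesses.
interface PriceConfigEntry {
  alias: string;            // e.g. "AAPL/USD" or "TSLA/USD.PRE"
  id: string;               // Pyth price feed id (hex string)
  time_difference: number;  // max seconds between pushes
  price_deviation: number;  // deviation trigger, in percent
  confidence_ratio: number; // confidence/price ratio trigger
  early_update?: Record<string, number>; // optional early-update thresholds; exact keys not shown here
}

// Hypothetical example (values invented for illustration).
const exampleEntry: PriceConfigEntry = {
  alias: "AAPL/USD",
  id: "0x…", // elided; real ids live in price-config.stable.sample.yaml
  time_difference: 60,
  price_deviation: 0.05,
  confidence_ratio: 1,
  early_update: { time_difference: 30 },
};
```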

Sequence Diagram(s)

sequenceDiagram
  autonumber
  participant CLI as CLI / evm command
  participant PL as PythPriceListener
  participant WC as WalletClient & Contract

  CLI->>PL: start() (await startListening())
  CLI->>PL: waitForFirstPriceUpdate(timeout=15000)
  alt first update received
    PL-->>CLI: true (price available)
    CLI->>WC: initialize wallet & contract
  else timeout / no update
    PL-->>CLI: false (timeout)
    CLI->>WC: initialize wallet & contract
  end
sequenceDiagram
  autonumber
  participant Upd as PriceUpdater
  participant EVM as EVM module
  participant RPC as RPC node

  Upd->>EVM: compute updateFee()
  EVM->>RPC: getGasPrice()
  Note right of EVM: Log initial gas price
  EVM->>EVM: determine gasLimitToUse (rounded input | default 1_000_000)
  EVM->>RPC: simulate tx (gas=gasLimitToUse)
  Note right of EVM: Log nonce, final gas price, update fee, gasLimitToUse
  EVM->>RPC: send tx (gas=gasLimitToUse)

Estimated code review effort

🎯 4 (Complex) | ⏱️ ~45 minutes

  • Focus review on apps/price_pusher/src/pyth-price-listener.ts (wait API, freshness logic, reconnect/error handling).
  • Verify apps/price_pusher/src/evm/evm.ts for correct gasLimit propagation, numeric rounding, and that logs do not leak secrets.
  • Check startup sequencing and timeout behavior in apps/price_pusher/src/evm/command.ts.
  • Validate Dockerfile CMD arguments (mnemonic-file substitution) and CI workflow tag/build-arg handling in .github/workflows/rainix-docker.yml.
  • Inspect flake.nix platform conditionals and replacement of distutils with setuptools.

Pre-merge checks and finishing touches

❌ Failed checks (1 warning, 1 inconclusive)
  • Description check (⚠️ Warning): The pull request description is missing critical sections required by the template: the Rationale and "How has this been tested" sections are absent, leaving only a vague Summary. Resolution: add a Rationale section explaining why these changes are necessary and a "How has this been tested" section documenting verification steps.
  • Title check (❓ Inconclusive): The title "Stox price pusher" is vague and lacks specific context about what changes were made; it reads like a feature name rather than a description of the actual modifications. Resolution: replace it with a more descriptive title that captures the main change, such as "Add Stox assets to price pusher configuration and docker setup" or "Configure price pusher for stock and equity assets."
✅ Passed checks (1 passed)
  • Docstring Coverage (✅ Passed): No functions found in the changed files to evaluate docstring coverage. Skipping docstring coverage check.

📜 Recent review details

Configuration used: Organization UI

Review profile: ASSERTIVE

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 6df13f2 and dd98829.

📒 Files selected for processing (1)
  • apps/price_pusher/price-config.stable.sample.yaml (1 hunks)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (3)
  • GitHub Check: build
  • GitHub Check: test
  • GitHub Check: build


@coderabbitai bot left a comment

Actionable comments posted: 10

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (1)
apps/price_pusher/src/pyth-price-listener.ts (1)

99-105: Use logger in error handler; ensure single reconnect loop.

-      console.error("Error receiving updates from Hermes:", error);
-      console.error("EventSource readyState:", eventSource.readyState);
+      this.logger.warn({ error, readyState: eventSource.readyState }, "Error receiving updates from Hermes");

Also verify that repeated errors don’t create concurrent listeners; consider guarding reconnection with a flag. Do you want me to propose that guard?
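
A minimal sketch of the guard the comment offers to propose, assuming a boolean flag on the listener; the names here (GuardedReconnector, connect, onError) are illustrative and not from the PR.

```typescript
// Sketch of a reconnection guard (illustrative names, not the PR's code): the flag ensures
// overlapping EventSource error events trigger at most one reconnect loop at a time.
class GuardedReconnector {
  private reconnecting = false;

  constructor(private readonly connect: () => Promise<void>) {}

  // Call from the EventSource error handler.
  async onError(): Promise<void> {
    if (this.reconnecting) return; // a reconnect is already in flight; drop duplicate triggers
    this.reconnecting = true;
    try {
      await this.connect();
    } finally {
      this.reconnecting = false;
    }
  }
}
```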

📜 Review details

Configuration used: CodeRabbit UI

Review profile: ASSERTIVE

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 5794589 and 2cf0f9e.

📒 Files selected for processing (6)
  • apps/price_pusher/price-config.stable.sample.yaml (1 hunks)
  • apps/price_pusher/src/evm/command.ts (1 hunks)
  • apps/price_pusher/src/evm/evm.ts (3 hunks)
  • apps/price_pusher/src/pyth-price-listener.ts (6 hunks)
  • flake.nix (1 hunks)
  • test.json (1 hunks)
🧰 Additional context used
🪛 YAMLlint (1.37.1)
apps/price_pusher/price-config.stable.sample.yaml

[error] 66-66: too many blank lines (2 > 0)

(empty-lines)

🪛 Biome (2.1.2)
test.json

[error] 31-61: End of file expected

Use an array for a sequence of values: [1, 2]

(parse)

🔇 Additional comments (3)
apps/price_pusher/src/evm/evm.ts (1)

232-233: Same gas price unit issue.

-    this.logger.debug(`Final gas price: ${gasPrice} gwei, nonce: ${txNonce}, update fee: ${updateFee} wei`);
+    const gasPriceGweiFinal = gasPrice / 1e9;
+    this.logger.debug(`Final gas price: ${gasPriceGweiFinal} gwei (${gasPrice} wei), nonce: ${txNonce}, update fee: ${updateFee} wei`);
flake.nix (1)

50-74: Nix scoping bug: with pkgs; applies only to the first list.

lib and stdenv are out of scope for the concatenations. This will evaluate with “attribute ‘lib’ missing”. Wrap the whole expression in with pkgs; ( ... ) or qualify as pkgs.lib / pkgs.stdenv.

-  buildInputs =
-    with pkgs; [
-      cli
-      git
-      nodejs
-      pkg-config
-      pnpm
-      pre-commit
-      python3
-      python3Packages.setuptools
-      graphviz
-      anchor
-    ]
-    # Linux-only deps
-    ++ lib.optionals stdenv.isLinux [
-      udev
-      libusb1
-    ]
-    # macOS deps (no udev; use system frameworks)
-    ++ lib.optionals stdenv.isDarwin [
-      libusb1
-      darwin.apple_sdk.frameworks.IOKit
-      darwin.apple_sdk.frameworks.CoreFoundation
-    ];
+  buildInputs = with pkgs; (
+    [
+      cli
+      git
+      nodejs
+      pkg-config
+      pnpm
+      pre-commit
+      python3
+      python3Packages.setuptools
+      graphviz
+      anchor
+    ]
+    # Linux-only deps
+    ++ lib.optionals stdenv.isLinux [
+      udev
+      libusb1
+    ]
+    # macOS deps (no udev; use system frameworks)
+    ++ lib.optionals stdenv.isDarwin [
+      libusb1
+      darwin.apple_sdk.frameworks.IOKit
+      darwin.apple_sdk.frameworks.CoreFoundation
+    ]
+  );

Likely an incorrect or invalid review comment.

apps/price_pusher/price-config.stable.sample.yaml (1)

1-65: No changes needed: price_deviation is defined in percent and the sample YAML comments correctly indicate thresholds of 0.05% and 0.03%.

Comment on lines +153 to +162
// Start the price listener and wait for first update
await pythListener.start();
const receivedUpdate = await pythListener.waitForFirstPriceUpdate(15000);

if (receivedUpdate) {
console.log('Price update received!');
} else {
console.log('Timeout waiting for price update');
}


⚠️ Potential issue | 🟡 Minor

Use structured logging and decide behavior on timeout.

Replace console.log with the existing pino logger for consistency, and make the timeout case explicit (warn or exit) to avoid silent degraded startup.

-    if (receivedUpdate) {
-      console.log('Price update received!');
-    } else {
-      console.log('Timeout waiting for price update');
-    }
+    if (receivedUpdate) {
+      logger.info('First Pyth price update received.');
+    } else {
+      logger.warn('Timeout waiting for first Pyth price update after 15s; continuing startup.');
+      // Optionally: return process.exit(1) if startup must fail without data
+    }
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
// Start the price listener and wait for first update
await pythListener.start();
const receivedUpdate = await pythListener.waitForFirstPriceUpdate(15000);
if (receivedUpdate) {
console.log('Price update received!');
} else {
console.log('Timeout waiting for price update');
}
// Start the price listener and wait for first update
await pythListener.start();
const receivedUpdate = await pythListener.waitForFirstPriceUpdate(15000);
if (receivedUpdate) {
logger.info('First Pyth price update received.');
} else {
logger.warn('Timeout waiting for first Pyth price update after 15s; continuing startup.');
// Optionally: return process.exit(1) if startup must fail without data
}
🤖 Prompt for AI Agents
In apps/price_pusher/src/evm/command.ts around lines 153 to 162, replace the
console.log calls with the module's pino logger instance (use the existing
logger variable) and make the timeout explicit: log a success with logger.info
when an update is received, and when waitForFirstPriceUpdate times out log
logger.warn (or logger.error if you prefer failing fast) with context, then
decide behavior—either throw an error or call process.exit(1) to fail startup
instead of silently continuing; update the code accordingly so logging is
consistent and the timeout path is explicit and handled.

Comment on lines +196 to 198

this.logger.debug(`Initial gas price: ${gasPrice} gwei`);


⚠️ Potential issue | 🟡 Minor

Gas price unit mismatch in logs.

getGasPrice returns wei; logs label it as gwei. Either log in wei or convert for readability.

-    this.logger.debug(`Initial gas price: ${gasPrice} gwei`);
+    const gasPriceGwei = gasPrice / 1e9;
+    this.logger.debug(`Initial gas price: ${gasPriceGwei} gwei (${gasPrice} wei)`);

Committable suggestion skipped: line range outside the PR's diff.

🤖 Prompt for AI Agents
In apps/price_pusher/src/evm/evm.ts around lines 196-198, the logged "Initial
gas price" is labeled as gwei but getGasPrice returns wei; change the log to
either convert the wei value to gwei before logging (e.g., using BigNumber
division by 1e9 or ethers.utils.formatUnits(gasPrice, "gwei")) or update the
label to "wei" if you want to log the raw value, ensuring correct units are
displayed.
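
For the conversion itself, one option is viem's formatGwei helper rather than manual division; this sketch assumes the project's viem dependency (which the simulateContract/writeContract calls imply) and uses console.debug as a stand-in for the pino logger.

```typescript
// Sketch only: format a wei bigint as gwei for log output via viem's formatGwei helper.
import { formatGwei } from "viem";

const gasPriceWei = 32_500_000_000n; // example value; the real value comes from getGasPrice()
console.debug(`Initial gas price: ${formatGwei(gasPriceWei)} gwei (${gasPriceWei} wei)`);
// In the pusher this would go through the existing pino logger rather than console.
```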

Comment on lines +247 to +252
const gasLimitToUse = this.gasLimit !== undefined
? BigInt(Math.ceil(this.gasLimit))
: BigInt(1000000); // Default gas limit of 1M gas units

this.logger.debug(`Using gas limit: ${gasLimitToUse}, gas price: ${gasPrice}, update fee: ${updateFee}`);


⚠️ Potential issue | 🟠 Major

Avoid forcing gas in simulation; apply gas limit on submission.

Providing a fixed gas in simulate can cause false reverts if it’s too low/high. Let simulate estimate gas; set the cap when sending.

-      const gasLimitToUse = this.gasLimit !== undefined
-        ? BigInt(Math.ceil(this.gasLimit))
-        : BigInt(1000000); // Default gas limit of 1M gas units
-      
-      this.logger.debug(`Using gas limit: ${gasLimitToUse}, gas price: ${gasPrice}, update fee: ${updateFee}`);
+      const gasLimitToUse = this.gasLimit !== undefined
+        ? BigInt(Math.ceil(this.gasLimit))
+        : undefined; // Let simulate/chain estimate if not provided
+      
+      this.logger.debug(
+        `Using gas limit: ${gasLimitToUse ?? 'estimate'}, gas price: ${gasPrice}, update fee: ${updateFee}`
+      );

And below:

-          {
-            value: updateFee,
-            gasPrice: BigInt(Math.ceil(gasPrice)),
-            nonce: txNonce,
-            gas: gasLimitToUse,
-          },
+          {
+            value: updateFee,
+            gasPrice: BigInt(Math.ceil(gasPrice)),
+            nonce: txNonce,
+          },
         );
 
-      this.logger.debug({ request }, "Simulated request successfully");
-
-      const hash = await this.client.writeContract(request);
+      const requestWithGas =
+        gasLimitToUse !== undefined ? { ...request, gas: gasLimitToUse } : request;
+      this.logger.debug({ request: requestWithGas }, "Simulated request successfully");
+
+      const hash = await this.client.writeContract(requestWithGas);
📝 Committable suggestion


Suggested change
const gasLimitToUse = this.gasLimit !== undefined
? BigInt(Math.ceil(this.gasLimit))
: BigInt(1000000); // Default gas limit of 1M gas units
this.logger.debug(`Using gas limit: ${gasLimitToUse}, gas price: ${gasPrice}, update fee: ${updateFee}`);
// let the chain estimate gas when no explicit limit is set
const gasLimitToUse = this.gasLimit !== undefined
? BigInt(Math.ceil(this.gasLimit))
: undefined; // Let simulate/chain estimate if not provided
this.logger.debug(
`Using gas limit: ${gasLimitToUse ?? 'estimate'}, gas price: ${gasPrice}, update fee: ${updateFee}`
);
// … earlier: await this.client.simulateContract(request,
// {
// value: updateFee,
// gasPrice: BigInt(Math.ceil(gasPrice)),
// nonce: txNonce,
{
value: updateFee,
gasPrice: BigInt(Math.ceil(gasPrice)),
nonce: txNonce,
},
);
// apply the cap only on the actual write
const requestWithGas =
gasLimitToUse !== undefined ? { ...request, gas: gasLimitToUse } : request;
this.logger.debug({ request: requestWithGas }, "Simulated request successfully");
const hash = await this.client.writeContract(requestWithGas);
// …
🤖 Prompt for AI Agents
In apps/price_pusher/src/evm/evm.ts around lines 247-252 the code forces a gas
value into the simulation step which can cause false reverts; change the flow so
simulation uses the provider's gas estimation (do not set gas/gasLimit for
simulate), and only apply the computed gasLimitToUse as a cap when constructing
the actual transaction for submission (e.g., pass gasLimitToUse into the
send/submit call or tx overrides). Keep the gasLimit calculation
(BigInt(Math.ceil(...) or default) and the debug log, but remove passing that
value into the simulation API and ensure it is used solely as the override on
the final send.

Comment on lines +38 to 49
await this.startListening();

// Store health check interval reference
this.healthCheckInterval = setInterval(() => {
if (
this.lastUpdated === undefined ||
this.lastUpdated < Date.now() - 30 * 1000
) {
throw new Error("Hermes Price feeds are not updating.");
// throw new Error("Hermes Price feeds are not updating.");
}
}, 5000);
}

⚠️ Potential issue | 🟠 Major

Health check no longer enforces liveness.

Previously this threw if feeds stopped updating; the check is now inert. At least log a warning to surface stuck streams.

-      if (
-        this.lastUpdated === undefined ||
-        this.lastUpdated < Date.now() - 30 * 1000
-      ) {
-        // throw new Error("Hermes Price feeds are not updating.");
-      }
+      if (
+        this.lastUpdated === undefined ||
+        this.lastUpdated < Date.now() - 30 * 1000
+      ) {
+        this.logger.warn("Hermes price feeds are not updating (stale >30s).");
+      }
📝 Committable suggestion


Suggested change
await this.startListening();
// Store health check interval reference
this.healthCheckInterval = setInterval(() => {
if (
this.lastUpdated === undefined ||
this.lastUpdated < Date.now() - 30 * 1000
) {
throw new Error("Hermes Price feeds are not updating.");
// throw new Error("Hermes Price feeds are not updating.");
}
}, 5000);
}
await this.startListening();
// Store health check interval reference
this.healthCheckInterval = setInterval(() => {
if (
this.lastUpdated === undefined ||
this.lastUpdated < Date.now() - 30 * 1000
) {
this.logger.warn("Hermes price feeds are not updating (stale >30s).");
}
}, 5000);
}
🤖 Prompt for AI Agents
In apps/price_pusher/src/pyth-price-listener.ts around lines 38 to 49, the
health-check interval currently contains a commented-out throw so it no longer
enforces liveness; restore observability by replacing the commented throw with a
warning log that includes contextual details (e.g., timestamp, this.lastUpdated,
and how stale the feed is). Use the instance logger (this.logger.warn(...)) if
available, otherwise console.warn(...), and keep the 5s interval logic intact so
repeated warnings surface stuck streams; ensure the message is clear and
actionable.

Comment on lines +55 to 56
console.log('Price IDs being requested:', this.priceIds);


🧹 Nitpick | 🔵 Trivial

Avoid console.log; use structured logger.

-    console.log('Price IDs being requested:', this.priceIds);
+    this.logger.debug({ priceIds: this.priceIds }, "Requesting price IDs");
📝 Committable suggestion


Suggested change
console.log('Price IDs being requested:', this.priceIds);
this.logger.debug({ priceIds: this.priceIds }, "Requesting price IDs");
🤖 Prompt for AI Agents
In apps/price_pusher/src/pyth-price-listener.ts around lines 55-56, replace the
console.log call with the module's structured logger (e.g., this.logger.info or
the shared logger import) and log the priceIds as a named field instead of
free-form text; for example, call the logger with a message and an object like {
priceIds: this.priceIds } (or JSON.stringify if the logger requires string
payloads). If no logger instance exists in this class, add or inject the
existing application logger and use it consistently for structured logs. Ensure
no sensitive data is logged.

Comment on lines +64 to 65
console.log('EventSource created, waiting for messages...');
eventSource.onmessage = (event: MessageEvent<string>) => {

🧹 Nitpick | 🔵 Trivial

Avoid console.log here (and console.error in this handler); use the logger with structured context.

-    console.log('EventSource created, waiting for messages...');
+    this.logger.debug("EventSource created, waiting for messages...");
📝 Committable suggestion


Suggested change
console.log('EventSource created, waiting for messages...');
eventSource.onmessage = (event: MessageEvent<string>) => {
this.logger.debug("EventSource created, waiting for messages...");
eventSource.onmessage = (event: MessageEvent<string>) => {
🤖 Prompt for AI Agents
In apps/price_pusher/src/pyth-price-listener.ts around lines 64-65, replace the
direct console usage with the project's logger: remove console.log('EventSource
created, waiting for messages...') and use logger.info with structured context
(e.g., { component: 'PythPriceListener', event: 'EventSourceCreated' }), and
ensure any console.error usages in this event handler are replaced with
logger.error including structured fields (error, component, event) so messages
and errors are emitted consistently and machine-readable.

Comment on lines +74 to 85
// Consider price to be currently available if it is not older than 24 hours
const currentTime = Date.now() / 1000;
const timeDiff = currentTime - priceUpdate.price.publish_time;
console.log(`Price age check: currentTime=${currentTime}, publishTime=${priceUpdate.price.publish_time}, diff=${timeDiff}s`);

const currentPrice =
Date.now() / 1000 - priceUpdate.price.publish_time > 60
timeDiff > 24 * 60 * 60 // 24 hours in seconds
? undefined
: priceUpdate.price;
if (currentPrice === undefined) {
this.logger.debug("Price is older than 60s, skipping");
console.log(`Price is older than 24 hours (${timeDiff}s), skipping`);
return;

⚠️ Potential issue | 🟠 Major

24h staleness threshold is too permissive; risk of pushing stale prices.

Revert to previous 60s or make it configurable; pushing day‑old data can be dangerous.

-        // Consider price to be currently available if it is not older than 24 hours
-        const currentTime = Date.now() / 1000;
-        const timeDiff = currentTime - priceUpdate.price.publish_time;
-        console.log(`Price age check: currentTime=${currentTime}, publishTime=${priceUpdate.price.publish_time}, diff=${timeDiff}s`);
+        // Consider price available only if not too old (default 60s)
+        const currentTime = Date.now() / 1000;
+        const timeDiff = currentTime - priceUpdate.price.publish_time;
+        this.logger.debug(
+          { currentTime, publishTime: priceUpdate.price.publish_time, ageSec: timeDiff },
+          "Price age check"
+        );
@@
-          timeDiff > 24 * 60 * 60 // 24 hours in seconds
+          timeDiff > 60 // TODO: make configurable
             ? undefined
             : priceUpdate.price;
         if (currentPrice === undefined) {
-          console.log(`Price is older than 24 hours (${timeDiff}s), skipping`);
+          this.logger.debug(`Price is older than threshold (${timeDiff}s), skipping`);
           return;
         }
📝 Committable suggestion


Suggested change
// Consider price to be currently available if it is not older than 24 hours
const currentTime = Date.now() / 1000;
const timeDiff = currentTime - priceUpdate.price.publish_time;
console.log(`Price age check: currentTime=${currentTime}, publishTime=${priceUpdate.price.publish_time}, diff=${timeDiff}s`);
const currentPrice =
Date.now() / 1000 - priceUpdate.price.publish_time > 60
timeDiff > 24 * 60 * 60 // 24 hours in seconds
? undefined
: priceUpdate.price;
if (currentPrice === undefined) {
this.logger.debug("Price is older than 60s, skipping");
console.log(`Price is older than 24 hours (${timeDiff}s), skipping`);
return;
// Consider price available only if not too old (default 60s)
const currentTime = Date.now() / 1000;
const timeDiff = currentTime - priceUpdate.price.publish_time;
this.logger.debug(
{ currentTime, publishTime: priceUpdate.price.publish_time, ageSec: timeDiff },
"Price age check"
);
const currentPrice =
timeDiff > 60 // TODO: make configurable
? undefined
: priceUpdate.price;
if (currentPrice === undefined) {
this.logger.debug(`Price is older than threshold (${timeDiff}s), skipping`);
return;
}
🤖 Prompt for AI Agents
In apps/price_pusher/src/pyth-price-listener.ts around lines 74 to 85, the
current staleness check uses a 24-hour threshold which is too permissive; change
it to use a 60-second default or a configurable environment variable. Replace
the hardcoded 24*60*60 with a constant loaded from configuration or process.env
(e.g., PRICE_STALENESS_SECONDS) defaulting to 60, update the age check to
compare against that constant, and adjust the log messages to reference the
configured threshold; ensure parsing/validation of the env var to an integer and
fall back to 60 if invalid.
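
A minimal sketch of the env-driven threshold described above. PRICE_STALENESS_SECONDS is a hypothetical variable name, and the 60-second fallback matches the pre-PR behavior.

```typescript
// Sketch of an env-configurable staleness threshold (hypothetical variable name).
const DEFAULT_STALENESS_SECONDS = 60;

function stalenessThresholdSeconds(): number {
  const raw = process.env.PRICE_STALENESS_SECONDS;
  const parsed = raw === undefined ? Number.NaN : Number.parseInt(raw, 10);
  return Number.isFinite(parsed) && parsed > 0 ? parsed : DEFAULT_STALENESS_SECONDS;
}

// Usage inside the age check:
//   const isStale = timeDiff > stalenessThresholdSeconds();
```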

Comment on lines +112 to +127
// Wait for the first price update to be received
async waitForFirstPriceUpdate(timeoutMs: number = 10000): Promise<boolean> {
return new Promise((resolve) => {
const startTime = Date.now();

const checkInterval = setInterval(() => {
if (this.latestPriceInfo.size > 0) {
clearInterval(checkInterval);
resolve(true);
} else if (Date.now() - startTime > timeoutMs) {
clearInterval(checkInterval);
resolve(false);
}
}, 100);
});
}

🧹 Nitpick | 🔵 Trivial

Busy‑wait polling for first update; prefer event‑driven resolve.

Set a flag on first message and resolve a Promise; avoid a tight interval.

-  async waitForFirstPriceUpdate(timeoutMs: number = 10000): Promise<boolean> {
-    return new Promise((resolve) => {
-      const startTime = Date.now();
-      
-      const checkInterval = setInterval(() => {
-        if (this.latestPriceInfo.size > 0) {
-          clearInterval(checkInterval);
-          resolve(true);
-        } else if (Date.now() - startTime > timeoutMs) {
-          clearInterval(checkInterval);
-          resolve(false);
-        }
-      }, 100);
-    });
-  }
+  private firstUpdateResolved = false;
+  async waitForFirstPriceUpdate(timeoutMs: number = 10000): Promise<boolean> {
+    if (this.latestPriceInfo.size > 0) return true;
+    return new Promise((resolve) => {
+      const timer = setTimeout(() => resolve(false), timeoutMs);
+      const originalHandler = this.hermesClient; // placeholder to indicate hook context
+      const onFirst = () => {
+        if (this.firstUpdateResolved || this.latestPriceInfo.size === 0) return;
+        this.firstUpdateResolved = true;
+        clearTimeout(timer);
+        resolve(true);
+      };
+      // Minimal change: piggyback on existing onmessage logic by checking map size asynchronously
+      const check = setInterval(() => {
+        if (this.latestPriceInfo.size > 0) {
+          clearInterval(check);
+          clearTimeout(timer);
+          resolve(true);
+        }
+      }, 100);
+    });
+  }

If you prefer, I can wire a dedicated “onFirstUpdate” resolve inside onmessage for a cleaner approach.

Committable suggestion skipped: line range outside the PR's diff.
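
For reference, a standalone sketch of the cleaner event-driven variant the bot offers to wire up: the message handler resolves a pending promise instead of the caller polling a map size. Names are illustrative, not the PR's code.

```typescript
// Standalone sketch of the event-driven alternative (illustrative names, not the PR's code).
class FirstUpdateGate {
  private received = false;
  private resolveFirst?: () => void;

  // Call this from the EventSource onmessage handler after the first price is stored.
  markUpdate(): void {
    if (this.received) return;
    this.received = true;
    this.resolveFirst?.();
  }

  // Resolves true on the first update, or false after timeoutMs with no update.
  waitForFirstUpdate(timeoutMs = 10_000): Promise<boolean> {
    if (this.received) return Promise.resolve(true);
    return new Promise<boolean>((resolve) => {
      const timer = setTimeout(() => resolve(false), timeoutMs);
      this.resolveFirst = () => {
        clearTimeout(timer);
        resolve(true);
      };
    });
  }
}
```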

Comment on lines +1 to +61
{
"binary": {
"encoding": "hex",
"data": [
"504e41550100000003b801000000040d006d55f42cf3492e591c963290374d67ba1a6b9dbb2d3d9db1d207b09f6c2052e44e069b5e7a97a476a1586c8262c60deb9c747b2b865de9070a8772c4e8b7253601033574ed63a91316da19b8f8d71710c942479cac6973741ec5d6726a24c64f9402506041d254617640e390734a79b396d5468e22594c916e816560a6123a202c810104df4bca370b014614929cf36bc470f84cd2aa2ae7b29d62b28c54c23d6d15e48e0fa8e5ca61f5962c350aaa759f4735d6da4b90ac5c308d3b78819260d85388df00060961f7702d0508d8c43459ddc64bdad7fb6ee1b469710ef9655e3196bbec01223c985ef893e6e42590797196adb9f8963eb19f2df93240ed68b99f25dc9bfdf00008ec6ef4749d48fffa58eb6df65aa0b19d82cca5163fb2fba12fda0a36ad7adb902a48d5136967814fcf69556fa15cb108a04096af0a4c62567c17bd1ed6634459010bde8cbebbceddb3e2cefce974230d71fab4acbeb3188e32dfc83dcd59b552945d01286970573b2cbf8aa286bafea423abbfc91a3c684fde1a2b316d4f646cf224010c354486ba5d060728df38ff73e0ca1c65f990565afdecb6d4adde6e7a125bd33e05b832b0f9ce74a7c89de6f3d6749ed8e6f61dff19a7419f3004270fb09dd8ea010d873ace5bfb5dfd923fa68f5511f55e486dd57690c8e11e5b0ed53796402f32f2521810719d8fe691afb6c942bfd94b7f7ec1e263a5e87cc1390a9a6e73268a7d000e4de0e5c8b333ad3fb3fae0d7b87e5e489488662311d46a159d199ef74b141150474d331a96f4e7a5530514745984469389f340db51fb9f4053ef6ed47dc9546a010f5262c00acc1355d87a4e36ad6bba39ac981b5a058fcfe7028cd3e020d33b902e7adad04227e1e02bb220e980e7acf85f28fa12ff323722873569f977cb26f8a10110ff400dddfce35111312cb581d3ec047ce24146f957bf097d21fae4220c369aff5576e28d0d1e35a4c149465f6290e2fefecb258dedcb853baabbc8b16c52a18f0111c4009ffcc484c691f0e0d31fe14c1d3a6c69b50604d28c08ff4ff982a08c242400bec0ba0cbc3f1372445ff0609d6c63517b97fef2d52e969c29f9ee3e9ecffd01120eb9f79e1fae8db4646219bca70d0ae5541614411e808a658c1bb3dbb1339a131cdf9e2cd1b41002fcd9e58bc116f7de3ce53b40595d28b7cabe04b03ce82f860168d649d000000000001ae101faedac5851e32b9b23b5f9411a8c2bac4aae3ed4dd7b811dd1a72ea4aa7100000000098d3cf3014155575600000000000e9added00002710ad13a5eda4ae267120e3b42a8ae131287fd2959c010055006f9cd89ef1b7fd39f667101a91ad578b6c6ace4579d5f7f285a4b06aa4504be60000000000269f150000000000000960fffffffb0000000068d59f550000000068d59f55000000000026e6c60000000000000ba20d6e90a4bf02608315ca7cbe9c093a7eba206a00325feaff95996ff11ad10f8f75c2cbeba5d1b0eeddde828efa594f70caa9f6b08d53a9bffdab25c3895abb474f7e262ce82605f73d9c48e93ed2ac26f8b5f3f484bb51f971db5f89ee7366b13b7eabaac8cc39e5bcfbfa608c101670fd7d0ac03a506c51f353c3306a656b7197d8c0cdc048ca9232f3ff59f74bcd39bdf1ad8b89a4d7be7e753daf78e6afda907578bd5f262d13bbe9d65f7b59e030d7e1a06cf1267040de3eff431846d9cc5be247006832a14670cc2d73014e5bd5cc9e11e67067a04842aaee98f41cac6d4a47124373a5c9567cbd0ff8764de0f221f5154feec719b9e78b449ff5cc4aee35393404f8"
]
},
"parsed": [
{
"id": "6f9cd89ef1b7fd39f667101a91ad578b6c6ace4579d5f7f285a4b06aa4504be6",
"price": {
"price": "2531093",
"conf": "2400",
"expo": -5,
"publish_time": 1758830421
},
"ema_price": {
"price": "2549446",
"conf": "2978",
"expo": -5,
"publish_time": 1758830421
},
"metadata": {
"slot": 245030381,
"proof_available_time": 1758874065,
"prev_publish_time": 1758830421
}
}
]
}

{
"binary": {
"encoding": "hex",
"data": [
"504e41550100000003b801000000040d00a3edd18afde9de270b8b62a927cf70e9fb28e9d20b6aa1c4716a6842568530862b4a4ae3528bf0a6703fe86221d1d2e4ce982ad27cfd8467aa5f385be20e10420103be4d8df0f97dc0e8fa4a1ec326ec147cf76821d4068aba491bf194c26af9f9e86957c739e708f250d0020c2742b0ff7809767cd1987611e1faf3c9fd54d748aa00040345f0e26f1ff242581cdd7143bf3f8cfb4cf2753d95d911da4169f17a6a39635fd39bd3f4dcf7549bd6dd38a911c435dd6612c39fba40c2cb99fcd37919d10d00061d1ab598ff984e59a7719dc6352880db0eb19409bca50904a8158fb15c27b9a2011eed57d6f324aec5e7158d14ff4a03a256c79b14c845a3f3168a443437ca72000850af5644aaf55b6b5710197807faadc386721c15852453a1919fdb0daf17944872aa9bd24a7a321199f041671079ba669be7cc8f588565b6c54dff127d71bfa1010ab4a54e1fe39a61d6a7036f845eec43eb8591016375ee556c9ed67efda89ca9f9501b576233792432eb531ff9b0683eb9557c049086726ad38f424f6210e01a9f000bab25c6ac74948efe2a86a7ae1c6680e2766133a646b54e17fe9305664222317654fd0b1ef0c8e6cc17a4bb9a9cb9ac688866b9f3665f69830555b574854d1326000ceb04d18c726945c1fd5644cf6acce395056c32f2a70ba011f3c9b5ccf28c33106b0c1d05b437b525e321001dfe6a46520b2b837f58c5eef075fc3eb571fa7c19010d98bb15d141785c8741333ccb732e9db40338853ca9aff2fdbbb68209d4aba858286a3da50b27e4390cc47367015e58208756c1903c727cccf0bf69c4d6082a2f000e0f2f11e5acd57d13c532c011489926f871b018ad98148ca1aa208b5aab2e5b583b74f0b18a602829b87d1082b2c4ffd6442db23cb0c5782ee3bfe6633317ad73010f9b66b9e2db932102f085d2c73930de4199f3eb46273c75c3042cdd39f4f03c107477b8778e05f3247315b624dc8f88df2b0d1eb645a762259dd3eab19996c45201104496ce0c3f47fad8d7ef98f3411b2ab113c24f46af0bdb6f0f8f81df9d1194780953f7cbffd10ef848832e52d2230eb5b80b0f435152ce1b01aba9006d5d1f03011115965aa800a3b261b0f5616ebc5be373c427adaaed8df1102eacde9ef9d2c38e115511f1158d0c0b3a79599dae6980059795a9417019c0619c363d5415e4625c0168d657ab00000000001ae101faedac5851e32b9b23b5f9411a8c2bac4aae3ed4dd7b811dd1a72ea4aa7100000000098d5d72014155575600000000000e9afe6c000027106c15651585d9e2509e03410d88ba292e5060b04101005500e62df6c8b4a85fe1a67db44dc12de5db330f7ac66b72dc658afedf0f4a415b43000009f902e3f30000000000b6fa283efffffff80000000068d657ab0000000068d657aa000009f62ce4d8c000000000b4c0439c0d6d8c877b317d842ed23393e73286991414dd17782c8d33bc29d051ef2bdad0ea299dd96ddbe1a83ce9e33ecd96a8484938bf724c3ec71b6130917926d7e9d8808a5cf97249bf74bedfb20552dd86b55f99bd60aabb441f7c8e9fb63e0bc53e4d978ace5f527e8375b892f23dbac6fb0b4504e2a7f05c942e3351cdb9116d98953a838c71f74507071414422f3082440019a194efb837eb1a5403ed3fdb4c6b85b843c56b4398673ed6f7b4fb2c9835ad36e13b134e3d914a33883c98ef2abc01c702bc7dad8e1c53cfd6e0777e5cd528fa5e614f4673c557fb5bfe3c008d66451b61bf157508dd7632cfb7200f2d05a4a8847a3c20597f03f13b30b68fb429cd0e0a96df"
]
},
"parsed": [
{
"id": "e62df6c8b4a85fe1a67db44dc12de5db330f7ac66b72dc658afedf0f4a415b43",
"price": {
"price": "10965100000000",
"conf": "3069847614",
"expo": -8,
"publish_time": 1758877611
},
"ema_price": {
"price": "10952919800000",
"conf": "3032499100",
"expo": -8,
"publish_time": 1758877611
},
"metadata": {
"slot": 245038700,
"proof_available_time": 1758877612,
"prev_publish_time": 1758877610
}
}
]
}

⚠️ Potential issue | 🔴 Critical

Invalid JSON: two top‑level objects.

The file contains two JSON documents back‑to‑back, which is not valid JSON. Wrap them in an array (or split into two files or use .ndjson if that’s intended). Static analysis error confirms parsing fails.

-{
+[
+{
@@
-}
+},
+
+{
@@
-}
+}
+]

If NDJSON was intended, rename to test.ndjson and keep one JSON object per line. Based on static analysis hints.

🧰 Tools
🪛 Biome (2.1.2)

[error] 31-61: End of file expected

Use an array for a sequence of values: [1, 2]

(parse)

🤖 Prompt for AI Agents
In test.json around lines 1-61 there are two top-level JSON objects back-to-back
which makes the file invalid JSON; fix by either wrapping both objects in a JSON
array (insert an opening [ at the start, a comma between the two objects, and a
closing ] at the end) or split them into separate files, or convert to NDJSON by
putting each object on its own line and renaming the file to test.ndjson; ensure
you choose one approach and update any code/config that reads this file
accordingly.
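
If the NDJSON route is taken, reading the file back is one JSON.parse per line; a minimal sketch, assuming the file is renamed to test.ndjson:

```typescript
// Sketch: load an NDJSON file with exactly one JSON document per line.
import { readFileSync } from "node:fs";

const docs = readFileSync("test.ndjson", "utf8")
  .split("\n")
  .filter((line) => line.trim().length > 0)
  .map((line) => JSON.parse(line));

console.log(`Loaded ${docs.length} price update documents`);
```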

@rouzwelt mentioned this pull request on Sep 29, 2025

@coderabbitai bot left a comment

Actionable comments posted: 11

📜 Review details

Configuration used: CodeRabbit UI

Review profile: ASSERTIVE

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 209e4bc and e0d83cf.

📒 Files selected for processing (2)
  • .github/workflows/rainix-docker.yml (1 hunks)
  • Dockerfile (1 hunks)
🧰 Additional context used
🪛 actionlint (1.7.7)
.github/workflows/rainix-docker.yml

42-42: the runner of "docker/setup-buildx-action@v2" action is too old to run on GitHub Actions. update the action's version to fix this issue

(action)


43-43: the runner of "docker/login-action@v2" action is too old to run on GitHub Actions. update the action's version to fix this issue

(action)

🪛 Checkov (3.2.334)
Dockerfile

[low] 10-10: Ensure that COPY is used instead of ADD in Dockerfiles

(CKV_DOCKER_4)


[low] 12-12: Ensure that WORKDIR values are absolute paths

(CKV_DOCKER_10)


[low] 1-21: Ensure that HEALTHCHECK instructions have been added to container images

(CKV_DOCKER_2)


[low] 1-21: Ensure that a user for the container has been created

(CKV_DOCKER_3)

🪛 Hadolint (2.14.0)
Dockerfile

[error] 10-10: Use COPY instead of ADD for files and folders

(DL3020)


[error] 12-12: Use absolute WORKDIR

(DL3000)

⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (4)
  • GitHub Check: price-pusher-image
  • GitHub Check: build
  • GitHub Check: test
  • GitHub Check: build
🔇 Additional comments (2)
.github/workflows/rainix-docker.yml (1)

44-46: Manual verification: ensure Docker Hub secrets are configured
The workflow uses secrets.DOCKERHUB_USERNAME and secrets.DOCKERHUB_TOKEN; confirm these are set in the repository’s GitHub Secrets.

Dockerfile (1)

18-18: Verify process substitution compatibility in CMD
Process substitution <(echo "${MNEMONIC}") requires a bash shell and /proc support in your container at runtime; manually test this in your CI or staging environment or switch to a temporary file or here-doc if unsupported.
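
One way to implement the temporary-file fallback without process substitution is a small Node wrapper that writes the mnemonic to a restricted-permission temp file and passes its path to the CLI. This is a sketch under that assumption; start-wrapper.ts is hypothetical and not part of the PR.

```typescript
// start-wrapper.ts (hypothetical, not part of the PR): avoid bash process substitution by
// writing the mnemonic to a 0600 temp file and passing its path to the pusher CLI.
import { mkdtempSync, writeFileSync, rmSync } from "node:fs";
import { tmpdir } from "node:os";
import { join } from "node:path";
import { spawnSync } from "node:child_process";

const dir = mkdtempSync(join(tmpdir(), "pusher-"));
const mnemonicPath = join(dir, "mnemonic.txt");
writeFileSync(mnemonicPath, process.env.MNEMONIC ?? "", { mode: 0o600 });

const result = spawnSync(
  "npm",
  ["run", "start", "evm", "--", "--mnemonic-file", mnemonicPath /* ...remaining flags as in the CMD */],
  { stdio: "inherit" },
);

rmSync(dir, { recursive: true, force: true }); // best-effort cleanup of the secret on disk
process.exit(result.status ?? 1);
```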

required: true

env:
CHANNEL: ${{ inputs.tag || github.head_ref || github.ref_name }}

⚠️ Potential issue | 🟡 Minor

Potential tag collision risk with dynamic CHANNEL naming.

The CHANNEL environment variable derives from inputs.tag, github.head_ref, or github.ref_name. For concurrent pushes to different branches, this could lead to race conditions or unintended tag overwrites.

Consider adding a unique suffix or using a more structured tagging strategy:

 env:
-  CHANNEL: ${{ inputs.tag || github.head_ref || github.ref_name }}
+  CHANNEL: ${{ inputs.tag || format('{0}-{1}', github.head_ref || github.ref_name, github.run_number) }}
   DOCKER_BUILDKIT: 1

Or use commit SHA for uniqueness:

env:
  CHANNEL: ${{ inputs.tag || format('{0}-{1}', github.head_ref || github.ref_name, github.sha) }}
🤖 Prompt for AI Agents
.github/workflows/rainix-docker.yml around line 12: the CHANNEL value is derived
from inputs.tag or branch name which can collide across concurrent runs; change
the assignment to append a unique identifier (e.g., short commit SHA or
timestamp) or use a structured format combining branch/tag with github.sha to
guarantee uniqueness and avoid tag overwrites; update the env variable
construction accordingly and ensure downstream steps use the new CHANNEL value
consistently.

Comment on lines +36 to +40
- name: Install deps
run: nix develop -c pnpm install --frozen-lockfile

- name: Build Price Pusher
run: nix develop -c pnpm turbo build --filter @pythnetwork/price-pusher

⚠️ Potential issue | 🔴 Critical

Build step alignment with Dockerfile concerns.

The workflow builds the price-pusher package using pnpm turbo build, but the Dockerfile (as currently written) doesn't install dependencies or run build steps. This creates a mismatch where the workflow builds locally, but the Docker image may not have these artifacts available at runtime.

Ensure the Dockerfile includes the necessary build steps (as flagged in the Dockerfile review), or structure the workflow to produce build artifacts that are then copied into a minimal runtime image using a multi-stage build pattern.

🤖 Prompt for AI Agents
.github/workflows/rainix-docker.yml lines 36-40: the CI runs `pnpm turbo build`
for @pythnetwork/price-pusher but the Dockerfile does not install deps or copy
build outputs, causing runtime images to miss artifacts; either update the
Dockerfile to perform dependency install and the package build (add a build
stage that runs pnpm install/build and copies the dist into the final image) or
change the workflow to output the built artifacts (e.g., tar/zip or push to
registry) and use a multi-stage Dockerfile that copies those artifacts into a
minimal runtime image; ensure the chosen approach installs dependencies with
frozen lockfile, produces the same build artifacts as the workflow, and that the
final image contains the built files and any runtime-only deps.

Comment on lines +42 to +43
- uses: docker/setup-buildx-action@v2
- uses: docker/login-action@v2

⚠️ Potential issue | 🟠 Major

Update Docker actions to latest versions.

The workflow uses docker/setup-buildx-action@v2 and docker/login-action@v2, which are outdated and may not run on current GitHub Actions runners.

Apply this diff to update to v3:

-      - uses: docker/setup-buildx-action@v2
-      - uses: docker/login-action@v2
+      - uses: docker/setup-buildx-action@v3
+      - uses: docker/login-action@v3
🧰 Tools
🪛 actionlint (1.7.7)

42-42: the runner of "docker/setup-buildx-action@v2" action is too old to run on GitHub Actions. update the action's version to fix this issue

(action)


43-43: the runner of "docker/login-action@v2" action is too old to run on GitHub Actions. update the action's version to fix this issue

(action)

🤖 Prompt for AI Agents
In .github/workflows/rainix-docker.yml around lines 42 to 43, the workflow pins
docker/setup-buildx-action@v2 and docker/login-action@v2 which are outdated;
update those action versions to @v3 by replacing the two usages with
docker/setup-buildx-action@v3 and docker/login-action@v3 (and run a quick
workflow dispatch to verify no new required inputs changed).

Comment on lines +47 to +51
- run: |
rm .dockerignore
docker build -t "$TAG_BASE:$CHANNEL" --build-arg GIT_SHA=${{ github.sha }} --build-arg DOCKER_CHANNEL=$CHANNEL .
- run: |
docker push "$TAG_BASE:$CHANNEL"

🧹 Nitpick | 🔵 Trivial

Consider using docker/build-push-action for better caching and features.

The workflow manually runs docker build and docker push commands. The docker/build-push-action provides better layer caching, multi-platform support, and BuildKit features.

Replace the manual docker commands:

-      - run: |
-          rm .dockerignore
-          docker build -t "$TAG_BASE:$CHANNEL" --build-arg GIT_SHA=${{ github.sha }} --build-arg DOCKER_CHANNEL=$CHANNEL .
-      - run: |
-          docker push "$TAG_BASE:$CHANNEL"
+      - uses: docker/build-push-action@v5
+        with:
+          context: .
+          push: true
+          tags: ${{ env.TAG_BASE }}:${{ env.CHANNEL }}
+          build-args: |
+            GIT_SHA=${{ github.sha }}
+            DOCKER_CHANNEL=${{ env.CHANNEL }}
+          cache-from: type=gha
+          cache-to: type=gha,mode=max
📝 Committable suggestion


Suggested change
- run: |
rm .dockerignore
docker build -t "$TAG_BASE:$CHANNEL" --build-arg GIT_SHA=${{ github.sha }} --build-arg DOCKER_CHANNEL=$CHANNEL .
- run: |
docker push "$TAG_BASE:$CHANNEL"
- uses: docker/build-push-action@v5
with:
context: .
push: true
tags: ${{ env.TAG_BASE }}:${{ env.CHANNEL }}
build-args: |
GIT_SHA=${{ github.sha }}
DOCKER_CHANNEL=${{ env.CHANNEL }}
cache-from: type=gha
cache-to: type=gha,mode=max
🤖 Prompt for AI Agents
.github/workflows/rainix-docker.yml around lines 47 to 51: replace the manual
docker build and docker push steps with docker/build-push-action to get
BuildKit, caching and multi-platform support; update the job to use - uses:
docker/build-push-action@v4 - with context: ., push: true, tags:
"$TAG_BASE:$CHANNEL", build-args: GIT_SHA=${{ github.sha }},
DOCKER_CHANNEL=$CHANNEL and configure cache-from/cache-to (e.g., type=registry)
as appropriate; ensure any removal of .dockerignore is handled before the action
or omitted if unnecessary and keep environment variables/permissions for pushing
(e.g., set up docker login or use GITHUB_TOKEN/registry credentials) so the
action can authenticate and push.

username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- run: |
rm .dockerignore

🛠️ Refactor suggestion | 🟠 Major

🧩 Analysis chain

Verify that removing .dockerignore is intentional.

The workflow explicitly removes the .dockerignore file before building. This could lead to unnecessarily large Docker images if build artifacts, node_modules, or other ignored files are included.

Verify whether removing .dockerignore is required:

If the removal is necessary, document why in a comment:

       - run: |
+          # Remove .dockerignore to include build artifacts in the image
           rm .dockerignore
           docker build -t "$TAG_BASE:$CHANNEL" --build-arg GIT_SHA=${{ github.sha }} --build-arg DOCKER_CHANNEL=$CHANNEL .

🏁 Script executed:

#!/bin/bash
# Check if .dockerignore exists and what it contains
if [ -f .dockerignore ]; then
  echo "=== Contents of .dockerignore ==="
  cat .dockerignore
  echo ""
  echo "=== Checking for large directories that might be ignored ==="
  fd -t d 'node_modules|dist|build|.git' --max-depth 3 -x du -sh
else
  echo ".dockerignore does not exist in the repository"
fi

Length of output: 448


Document rationale for removing .dockerignore
Add a comment above rm .dockerignore to explain that .dockerignore ignores the lib directory and other build outputs, so it must be removed to include necessary artifacts in the Docker build context:

       - run: |
+        # Remove .dockerignore so compiled artifacts (lib, build outputs, node_modules, etc.) are included in Docker context
         rm .dockerignore
         docker build -t "$TAG_BASE:$CHANNEL" --build-arg GIT_SHA=${{ github.sha }} --build-arg DOCKER_CHANNEL=$CHANNEL .

Committable suggestion skipped: line range outside the PR's diff.

🤖 Prompt for AI Agents
.github/workflows/rainix-docker.yml around line 48: add a brief comment above
the rm .dockerignore step explaining that the repository's .dockerignore
excludes the lib directory and other build outputs needed in the image, so we
remove .dockerignore to ensure those artifacts are included in the Docker build
context; keep the comment concise and factual.

Comment on lines +1 to +21
FROM node:22.14

# set git sha and docker tag form build time arg to run time env in container
ARG GIT_SHA
ARG DOCKER_CHANNEL
ENV GIT_COMMIT=$GIT_SHA
ENV DOCKER_TAG=$DOCKER_CHANNEL

WORKDIR /price-pusher
ADD . .

WORKDIR apps/price_pusher
CMD ["bash", "-c", "npm run start evm -- \
--price-config-file ./price-config.stable.sample.yaml \
--endpoint \"${ENDPOINT}\" \
--pyth-contract-address \"${PYTH_CONTRACT_ADDRESS}\" \
--price-service-endpoint \"${PRICE_SERVICE_ENDPOINT}\" \
--mnemonic-file <(echo \"${MNEMONIC}\") \
--pushing-frequency \"${PUSHING_FREQUENCY:-300}\" \
--polling-frequency \"${POLLING_FREQUENCY:-5}\" \
--override-gas-price-multiplier \"${GAS_PRICE_MULTIPLIER:-1.1}\""]

🧹 Nitpick | 🔵 Trivial

Add HEALTHCHECK instruction for container monitoring.

A HEALTHCHECK instruction allows Docker and orchestration platforms to monitor the health of the running container and restart it if necessary.

Consider adding a health check appropriate for your application:

HEALTHCHECK --interval=30s --timeout=3s --start-period=5s --retries=3 \
  CMD node healthcheck.js || exit 1

Alternatively, if the price pusher exposes an HTTP endpoint, use a curl-based health check:

HEALTHCHECK --interval=30s --timeout=3s --start-period=5s --retries=3 \
  CMD curl -f http://localhost:PORT/health || exit 1
🧰 Tools
🪛 Checkov (3.2.334)

[low] 10-10: Ensure that COPY is used instead of ADD in Dockerfiles

(CKV_DOCKER_4)


[low] 12-12: Ensure that WORKDIR values are absolute paths

(CKV_DOCKER_10)


[low] 1-21: Ensure that HEALTHCHECK instructions have been added to container images

(CKV_DOCKER_2)


[low] 1-21: Ensure that a user for the container has been created

(CKV_DOCKER_3)

🪛 Hadolint (2.14.0)

[error] 10-10: Use COPY instead of ADD for files and folders

(DL3020)


[error] 12-12: Use absolute WORKDIR

(DL3000)

🤖 Prompt for AI Agents
Dockerfile lines 1-21: add a HEALTHCHECK instruction near the end of the
Dockerfile so container orchestrators can detect and restart unhealthy
containers; implement either a lightweight node-based check (add a small
healthcheck.js to the app and ensure it’s copied into the image, then add a
HEALTHCHECK that runs node healthcheck.js with sensible
--interval/--timeout/--start-period/--retries) or an HTTP check (ensure the
service listens on a known PORT and add a HEALTHCHECK that curls the local
/health endpoint with the same timing options); place the HEALTHCHECK after
CMD/ENTRYPOINT or at the end of the file and document the required health
endpoint/script and port in the repo README.
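
If the HTTP-endpoint route is chosen, the referenced health check script could be as small as the sketch below. The /health path and HEALTH_PORT variable are assumptions; the pusher would need to expose such an endpoint for this check to be meaningful.

```typescript
// healthcheck.ts (sketch): exit 0 when GET /health returns 200, otherwise exit 1.
// The /health path and HEALTH_PORT variable are assumptions, not part of the PR.
import http from "node:http";

const port = Number(process.env.HEALTH_PORT ?? 8080);
const req = http.get({ host: "127.0.0.1", port, path: "/health", timeout: 2000 }, (res) => {
  process.exit(res.statusCode === 200 ? 0 : 1);
});
req.on("timeout", () => {
  req.destroy();
  process.exit(1);
});
req.on("error", () => process.exit(1));
```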

⚠️ Potential issue | 🟠 Major

Security: Add a non-root USER directive.

The container runs as root by default, which poses a security risk. Create and switch to a non-privileged user before running the application.

Add a non-root user:

 WORKDIR /price-pusher
 COPY . .
+RUN groupadd -r appuser && useradd -r -g appuser appuser
+RUN chown -R appuser:appuser /price-pusher

+USER appuser
 WORKDIR /price-pusher/apps/price_pusher

Committable suggestion skipped: line range outside the PR's diff.

🧰 Tools
🪛 Checkov (3.2.334)

[low] 10-10: Ensure that COPY is used instead of ADD in Dockerfiles

(CKV_DOCKER_4)


[low] 12-12: Ensure that WORKDIR values are absolute paths

(CKV_DOCKER_10)


[low] 1-21: Ensure that HEALTHCHECK instructions have been added to container images

(CKV_DOCKER_2)


[low] 1-21: Ensure that a user for the container has been created

(CKV_DOCKER_3)

🪛 Hadolint (2.14.0)

[error] 10-10: Use COPY instead of ADD for files and folders

(DL3020)


[error] 12-12: Use absolute WORKDIR

(DL3000)

🤖 Prompt for AI Agents
In Dockerfile around lines 1 to 21, the image runs as root by default; create a
non-privileged user, set its home/workdir ownership and switch to it before CMD
to reduce attack surface. Add steps to: create a user and group (non-root), set
appropriate ownership (chown) of /price-pusher and apps/price_pusher so the new
user can read/write, set HOME and USER to that non-root account, and switch to
that USER before the CMD; ensure any files or ports the process needs are
accessible by that user and avoid running privileged setup commands after
switching.

@@ -0,0 +1,21 @@
FROM node:22.14

⚠️ Potential issue | 🔴 Critical

🧩 Analysis chain

Verify that the Node.js base image version is current and secure.

The Dockerfile uses node:22.14 as the base image. Ensure this version is up-to-date and free from known vulnerabilities.


🌐 Web query:

What is the latest LTS version of Node.js?

💡 Result:

The current LTS major is Node.js v22 ("Jod"); the latest release in that line is v22.20.0 (released Sep 24, 2025). [1][2]

Sources:
[1] Node.js v22.20.0 release notes (Sep 24, 2025).
[2] Node.js Releases listing showing v22 as Active LTS.


Update Node.js base image to v22.20.0
Replace FROM node:22.14 with FROM node:22.20.0 (latest Active LTS as of Sep 24, 2025) to include the most recent security patches.

🧰 Tools
🪛 Checkov (3.2.334)

[low] 1-21: Ensure that HEALTHCHECK instructions have been added to container images

(CKV_DOCKER_2)


[low] 1-21: Ensure that a user for the container has been created

(CKV_DOCKER_3)

🤖 Prompt for AI Agents
In Dockerfile around lines 1 to 1, the base image is pinned to node:22.14;
update it to the current Active LTS by replacing that line with FROM
node:22.20.0 so the build uses Node.js v22.20.0 with latest security patches;
ensure no other Dockerfile references or CI cache expectations require the older
tag and rebuild any dependent images.

Comment on lines +9 to +12
WORKDIR /price-pusher
ADD . .

WORKDIR apps/price_pusher

⚠️ Potential issue | 🔴 Critical

Critical: Missing dependency installation and build steps.

The Dockerfile copies all files and switches to the apps/price_pusher directory, but there's no npm install or build step. Without installing dependencies, the npm run start command will fail at runtime.

Add dependency installation and build steps after copying files:

 WORKDIR /price-pusher
 COPY . .
+RUN npm install --frozen-lockfile
+RUN npm run build

 WORKDIR apps/price_pusher

Alternatively, if using a monorepo tool like turbo (as indicated in the workflow), adjust accordingly:

 WORKDIR /price-pusher
 COPY . .
+RUN npm install -g pnpm
+RUN pnpm install --frozen-lockfile
+RUN pnpm turbo build --filter @pythnetwork/price-pusher

 WORKDIR apps/price_pusher
📝 Committable suggestion


Suggested change
WORKDIR /price-pusher
ADD . .
WORKDIR apps/price_pusher
WORKDIR /price-pusher
COPY . .
RUN npm install --frozen-lockfile
RUN npm run build
WORKDIR apps/price_pusher
Suggested change
WORKDIR /price-pusher
ADD . .
WORKDIR apps/price_pusher
WORKDIR /price-pusher
COPY . .
RUN npm install -g pnpm
RUN pnpm install --frozen-lockfile
RUN pnpm turbo build --filter @pythnetwork/price-pusher
WORKDIR apps/price_pusher
🧰 Tools
🪛 Checkov (3.2.334)

[low] 10-10: Ensure that COPY is used instead of ADD in Dockerfiles

(CKV_DOCKER_4)


[low] 12-12: Ensure that WORKDIR values are absolute paths

(CKV_DOCKER_10)

🪛 Hadolint (2.14.0)

[error] 10-10: Use COPY instead of ADD for files and folders

(DL3020)


[error] 12-12: Use absolute WORKDIR

(DL3000)

ENV DOCKER_TAG=$DOCKER_CHANNEL

WORKDIR /price-pusher
ADD . .

🛠️ Refactor suggestion | 🟠 Major

Use COPY instead of ADD for regular files.

Docker best practices recommend using COPY instead of ADD for copying files and directories, as ADD has additional features (like extracting tar files) that can lead to unexpected behavior.

Apply this diff:

-ADD . .
+COPY . .
🧰 Tools
🪛 Checkov (3.2.334)

[low] 10-10: Ensure that COPY is used instead of ADD in Dockerfiles

(CKV_DOCKER_4)

🪛 Hadolint (2.14.0)

[error] 10-10: Use COPY instead of ADD for files and folders

(DL3020)

🤖 Prompt for AI Agents
In Dockerfile around line 10, the Dockerfile uses ADD . . which is discouraged
for regular file copying; replace the ADD command with COPY . . to follow best
practices (use COPY for simple file/directory copies and reserve ADD only when
you need its extra features like remote URLs or automatic tar extraction).

WORKDIR /price-pusher
ADD . .

WORKDIR apps/price_pusher

🛠️ Refactor suggestion | 🟠 Major

Use absolute WORKDIR path.

The second WORKDIR uses a relative path. Docker best practices recommend using absolute paths for clarity and to avoid ambiguity.

Apply this diff:

-WORKDIR apps/price_pusher
+WORKDIR /price-pusher/apps/price_pusher
🧰 Tools
🪛 Checkov (3.2.334)

[low] 12-12: Ensure that WORKDIR values are absolute paths

(CKV_DOCKER_10)

🪛 Hadolint (2.14.0)

[error] 12-12: Use absolute WORKDIR

(DL3000)

🤖 Prompt for AI Agents
In Dockerfile at line 12, the WORKDIR is set to a relative path
("apps/price_pusher"); change it to an absolute path that matches where the
project was copied (e.g. "/price-pusher/apps/price_pusher") so the Docker image
uses an unambiguous working directory; update the WORKDIR line to the chosen
absolute path.

@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 1

📜 Review details

Configuration used: CodeRabbit UI

Review profile: ASSERTIVE

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 10cb706 and bac6da8.

📒 Files selected for processing (1)
  • apps/price_pusher/price-config.stable.sample.yaml (1 hunks)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (3)
  • GitHub Check: build
  • GitHub Check: test
  • GitHub Check: build

@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 2

📜 Review details

Configuration used: Organization UI

Review profile: ASSERTIVE

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 68c9cac and 6df13f2.

📒 Files selected for processing (1)
  • apps/price_pusher/price-config.stable.sample.yaml (1 hunks)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (2)
  • GitHub Check: test
  • GitHub Check: build
🔇 Additional comments (1)
apps/price_pusher/price-config.stable.sample.yaml (1)

2-2: Verify that all price feed IDs match the official Pyth documentation for the Stable channel.

The price pusher service monitors both the off-chain and on-chain Pyth price for a configured set of price feeds, so incorrect IDs will cause runtime failures during price updates. Ensure each ID corresponds to the correct asset and market variant (e.g., PRE/POST) according to Pyth's official price feed list.
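One quick way to spot-check an individual id is to ask a Hermes endpoint for its latest price and confirm the feed resolves. The sketch below assumes Node 18+ (global fetch); the public hermes.pyth.network URL and the GOOG/USD id from the config are used purely as examples:

// Spot-check a configured feed id against a Hermes price service.
const HERMES = "https://hermes.pyth.network"; // assumption: any Hermes-compatible endpoint works
const feedId =
  "e65ff435be42630439c96396653a342829e877e2aafaeaf1a10d0ee5fd2cf3f2"; // GOOG/USD per the config

async function checkFeed(id: string): Promise<void> {
  const res = await fetch(`${HERMES}/v2/updates/price/latest?ids[]=${id}`);
  if (!res.ok) {
    throw new Error(`Hermes returned ${res.status} for ${id}`);
  }
  const body = (await res.json()) as {
    parsed?: { id: string; price: { price: string; expo: number } }[];
  };
  const parsed = body.parsed?.[0];
  if (!parsed) {
    throw new Error(`no parsed price returned for ${id}`);
  }
  console.log(`${id} -> ${parsed.price.price} * 10^${parsed.price.expo}`);
}

checkFeed(feedId).catch((err) => {
  console.error(err);
  process.exit(1);
});

Running this for each id in the file (or comparing against Pyth's published feed list) would catch a typo before it surfaces as a runtime push failure.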

Comment on lines 1 to 208
- alias: GOOG/USD
  id: e65ff435be42630439c96396653a342829e877e2aafaeaf1a10d0ee5fd2cf3f2
  time_difference: 300 # pushes if on-chain is >5min behind
  price_deviation: 0.05 # 0.05% deviation trips a push
  confidence_ratio: 200 # effectively ignore this trigger
  early_update:
    time_difference: 150
    price_deviation: 0.03
- alias: AMZN/USD
  id: b5d0e0fa58a1f8b81498ae670ce93c872d14434b72c364885d4fa1b257cbb07a
  time_difference: 300 # pushes if on-chain is >5min behind
  price_deviation: 0.05 # 0.05% deviation trips a push
  confidence_ratio: 200 # effectively ignore this trigger
  early_update:
    time_difference: 150
    price_deviation: 0.03
- alias: AMZN/USD.PRE
  id: 82c59e36a8e0247e15283748d6cd51f5fa1019d73fbf3ab6d927e17d9e357a7f
  time_difference: 300 # pushes if on-chain is >5min behind
  price_deviation: 0.05 # 0.05% deviation trips a push
  confidence_ratio: 200 # effectively ignore this trigger
  early_update:
    time_difference: 150
    price_deviation: 0.03
- alias: AMZN/USD.POST
  id: 62731dfcc8b8542e52753f208248c3e73fab2ec15422d6f65c2decda71ccea0d
  time_difference: 300 # pushes if on-chain is >5min behind
  price_deviation: 0.05 # 0.05% deviation trips a push
  confidence_ratio: 200 # effectively ignore this trigger
  early_update:
    time_difference: 150
    price_deviation: 0.03
- alias: AAPL/USD
  id: 49f6b65cb1de6b10eaf75e7c03ca029c306d0357e91b5311b175084a5ad55688
  time_difference: 300 # pushes if on-chain is >5min behind
  price_deviation: 0.05 # 0.05% deviation trips a push
  confidence_ratio: 200 # effectively ignore this trigger
  early_update:
    time_difference: 150
    price_deviation: 0.03
- alias: MSFT/USD
  id: d0ca23c1cc005e004ccf1db5bf76aeb6a49218f43dac3d4b275e92de12ded4d1
  time_difference: 300 # pushes if on-chain is >5min behind
  price_deviation: 0.05 # 0.05% deviation trips a push
  confidence_ratio: 200 # effectively ignore this trigger
  early_update:
    time_difference: 150
    price_deviation: 0.03
- alias: TSLA/USD
  id: 16dad506d7db8da01c87581c87ca897a012a153557d4d578c3b9c9e1bc0632f1
  time_difference: 300 # pushes if on-chain is >5min behind
  price_deviation: 0.05 # 0.05% deviation trips a push
  confidence_ratio: 200 # effectively ignore this trigger
  early_update:
    time_difference: 150
    price_deviation: 0.03
- alias: TSLA/USD.PRE
  id: 42676a595d0099c381687124805c8bb22c75424dffcaa55e3dc6549854ebe20a
  time_difference: 300 # pushes if on-chain is >5min behind
  price_deviation: 0.05 # 0.05% deviation trips a push
  confidence_ratio: 200 # effectively ignore this trigger
  early_update:
    time_difference: 150
    price_deviation: 0.03
- alias: TSLA/USD.POST
  id: 2a797e196973b72447e0ab8e841d9f5706c37dc581fe66a0bd21bcd256cdb9b9
  time_difference: 300 # pushes if on-chain is >5min behind
  price_deviation: 0.05 # 0.05% deviation trips a push
  confidence_ratio: 200 # effectively ignore this trigger
  early_update:
    time_difference: 150
    price_deviation: 0.03
- alias: NVDA/USD
  id: b1073854ed24cbc755dc527418f52b7d271f6cc967bbf8d8129112b18860a593
  time_difference: 300 # pushes if on-chain is >5min behind
  price_deviation: 0.05 # 0.05% deviation trips a push
  confidence_ratio: 200 # effectively ignore this trigger
  early_update:
    time_difference: 150
    price_deviation: 0.03
- alias: NVDA/USD.PRE
  id: 61c4ca5b9731a79e285a01e24432d57d89f0ecdd4cd7828196ca8992d5eafef6
  time_difference: 300 # pushes if on-chain is >5min behind
  price_deviation: 0.05 # 0.05% deviation trips a push
  confidence_ratio: 200 # effectively ignore this trigger
  early_update:
    time_difference: 150
    price_deviation: 0.03
- alias: NVDA/USD.POST
  id: 25719379353a508b1531945f3c466759d6efd866f52fbaeb3631decb70ba381f
  time_difference: 300 # pushes if on-chain is >5min behind
  price_deviation: 0.05 # 0.05% deviation trips a push
  confidence_ratio: 200 # effectively ignore this trigger
  early_update:
    time_difference: 150
    price_deviation: 0.03
- alias: META/USD
  id: 78a3e3b8e676a8f73c439f5d749737034b139bbbe899ba5775216fba596607fe
  time_difference: 300 # pushes if on-chain is >5min behind
  price_deviation: 0.05 # 0.05% deviation trips a push
  confidence_ratio: 200 # effectively ignore this trigger
  early_update:
    time_difference: 150
    price_deviation: 0.03
- alias: GME/USD
  id: 6f9cd89ef1b7fd39f667101a91ad578b6c6ace4579d5f7f285a4b06aa4504be6
  time_difference: 300 # pushes if on-chain is >5min behind
  price_deviation: 0.05 # 0.05% deviation trips a push
  confidence_ratio: 200 # effectively ignore this trigger
  early_update:
    time_difference: 150
    price_deviation: 0.03
- alias: MSTR/USD
  id: e1e80251e5f5184f2195008382538e847fafc36f751896889dd3d1b1f6111f09
  time_difference: 300 # pushes if on-chain is >5min behind
  price_deviation: 0.05 # 0.05% deviation trips a push
  confidence_ratio: 200 # effectively ignore this trigger
  early_update:
    time_difference: 150
    price_deviation: 0.03
- alias: MSTR/USD.PRE
  id: 1a11eb21c271f3127e4c9ec8a0e9b1042dc088ccba7a94a1a7d1aa37599a00f6
  time_difference: 300 # pushes if on-chain is >5min behind
  price_deviation: 0.05 # 0.05% deviation trips a push
  confidence_ratio: 200 # effectively ignore this trigger
  early_update:
    time_difference: 150
    price_deviation: 0.03
- alias: MSTR/USD.POST
  id: d8b856d7e17c467877d2d947f27b832db0d65b362ddb6f728797d46b0a8b54c0
  time_difference: 300 # pushes if on-chain is >5min behind
  price_deviation: 0.05 # 0.05% deviation trips a push
  confidence_ratio: 200 # effectively ignore this trigger
  early_update:
    time_difference: 150
    price_deviation: 0.03
- alias: BRKB/USD
  id: e21c688b7fc65b4606a50f3635f466f6986db129bf16979875d160f9c508e8c7
  time_difference: 300 # pushes if on-chain is >5min behind
  price_deviation: 0.05 # 0.05% deviation trips a push
  confidence_ratio: 200 # effectively ignore this trigger
  early_update:
    time_difference: 150
    price_deviation: 0.03
- alias: SPLG/USD
  id: 4dfbf28d72ab41a878afcd4c6d5e9593dca7cf65a0da739cbad9b7414004f82d
  time_difference: 300 # pushes if on-chain is >5min behind
  price_deviation: 0.05 # 0.05% deviation trips a push
  confidence_ratio: 200 # effectively ignore this trigger
  early_update:
    time_difference: 150
    price_deviation: 0.03
- alias: IAU/USD
  id: f703fbded84f7da4bd9ff4661b5d1ffefa8a9c90b7fa12f247edc8251efac914
  time_difference: 300 # pushes if on-chain is >5min behind
  price_deviation: 0.05 # 0.05% deviation trips a push
  confidence_ratio: 200 # effectively ignore this trigger
  early_update:
    time_difference: 150
    price_deviation: 0.03
- alias: COIN/USD
  id: fee33f2a978bf32dd6b662b65ba8083c6773b494f8401194ec1870c640860245
  time_difference: 300 # pushes if on-chain is >5min behind
  price_deviation: 0.05 # 0.05% deviation trips a push
  confidence_ratio: 200 # effectively ignore this trigger
  early_update:
    time_difference: 150
    price_deviation: 0.03
- alias: COIN/USD.PRE
  id: 8bdee6bc9dc5a61b971e31dcfae96fc0c7eae37b2604aa6002ad22980bd3517c
  time_difference: 300 # pushes if on-chain is >5min behind
  price_deviation: 0.05 # 0.05% deviation trips a push
  confidence_ratio: 200 # effectively ignore this trigger
  early_update:
    time_difference: 150
    price_deviation: 0.03
- alias: COIN/USD.POST
  id: 5c3bd92f2eed33779040caea9f82fac705f5121d26251f8f5e17ec35b9559cd4
  time_difference: 300 # pushes if on-chain is >5min behind
  price_deviation: 0.05 # 0.05% deviation trips a push
  confidence_ratio: 200 # effectively ignore this trigger
  early_update:
    time_difference: 150
    price_deviation: 0.03
- alias: SIVR/USD
  id: 0a5ee42b0f7287a777926d08bc185a6a60f42f40a9b63d78d85d4a03ee2e3737
  time_difference: 300 # pushes if on-chain is >5min behind
  price_deviation: 0.05 # 0.05% deviation trips a push
  confidence_ratio: 200 # effectively ignore this trigger
  early_update:
    time_difference: 150
    price_deviation: 0.03
- alias: CRCL/USD
  id: 92b8527aabe59ea2b12230f7b532769b133ffb118dfbd48ff676f14b273f1365
  time_difference: 300 # pushes if on-chain is >5min behind
  price_deviation: 0.05 # 0.05% deviation trips a push
  confidence_ratio: 200 # effectively ignore this trigger
  early_update:
    time_difference: 150
    price_deviation: 0.03
- alias: PPLT/USD
  id: 782410278b6c8aa2d437812281526012808404aa14c243f73fb9939eeb88d430
  time_difference: 300 # pushes if on-chain is >5min behind
  price_deviation: 0.05 # 0.05% deviation trips a push
  confidence_ratio: 200 # effectively ignore this trigger
  early_update:
    time_difference: 150
    price_deviation: 0.03

🧹 Nitpick | 🔵 Trivial

Consider extracting common configuration to reduce duplication.

All 26 entries share identical values for time_difference, price_deviation, confidence_ratio, and early_update blocks—only alias and id vary. This creates a maintenance burden: updating the push policy requires changing all 26 entries consistently.

Consider one of these approaches:

  1. YAML anchors/aliases to define defaults once:
defaults: &default_config
  time_difference: 300
  price_deviation: 0.05
  confidence_ratio: 200
  early_update:
    time_difference: 150
    price_deviation: 0.03

- alias: GOOG/USD
  id: e65ff435be42630439c96396653a342829e877e2aafaeaf1a10d0ee5fd2cf3f2
  <<: *default_config
  2. Application-level defaults: modify the price-pusher to support a defaults section with per-entry overrides, reducing the config to just alias and id for standard entries (a rough loader-side sketch follows the AI prompt below).
🤖 Prompt for AI Agents
In apps/price_pusher/price-config.stable.sample.yaml lines 1-208 the same
push-policy fields (time_difference, price_deviation, confidence_ratio,
early_update) are duplicated across 26 entries; extract those common values into
a single defaults block and have each entry inherit them. Fix by either (A)
adding a YAML defaults anchor/alias and merging it into each price entry using
the YAML merge key so entries only need alias and id, or (B) extend the
price-pusher config parser to accept a top-level "defaults" section and
programmatically merge defaults into each entry at load time (respecting
per-entry overrides). Update the sample file to use the chosen approach and
adjust any docs/tests to reflect the new config format.
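The loader-side merge is arguably the less invasive option, since the sample file is currently a top-level YAML sequence and a sibling defaults: mapping would change its shape. A rough sketch of what a defaults-aware loader could do follows; the file layout with a feeds: key, the type names, and the use of js-yaml are all illustrative assumptions, not the pusher's existing API:

import { readFileSync } from "node:fs";
import { load } from "js-yaml";

// Illustrative shapes only; the real price-pusher config types differ.
interface PushPolicy {
  time_difference: number;
  price_deviation: number;
  confidence_ratio: number;
  early_update?: { time_difference: number; price_deviation: number };
}
type FeedEntry = { alias: string; id: string } & Partial<PushPolicy>;
interface ConfigFile {
  defaults?: PushPolicy;
  feeds: FeedEntry[];
}

// Merge the shared policy into each feed, per-entry values taking precedence.
export function loadPriceConfig(path: string): Array<FeedEntry & PushPolicy> {
  const raw = load(readFileSync(path, "utf8")) as ConfigFile;
  const defaults = raw.defaults ?? ({} as PushPolicy);
  return raw.feeds.map((feed) => ({ ...defaults, ...feed })) as Array<
    FeedEntry & PushPolicy
  >;
}

With something like this in place, the sample file would shrink to one defaults block plus a single alias/id pair per feed.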

- alias: GOOG/USD
id: e65ff435be42630439c96396653a342829e877e2aafaeaf1a10d0ee5fd2cf3f2
time_difference: 300 # pushes if on-chain is >5min behind
price_deviation: 0.05 # 0.05% deviation trips a push

⚠️ Potential issue | 🟠 Major

Incorrect comment persists despite prior fix claim.

The comment "0.05% deviation" appears on all price_deviation: 0.05 lines throughout the file, but a value of 0.05 represents 5%, not 0.05%. Past review comments indicate this was addressed in commit c345fbf, yet the incorrect comments remain in the current code. This mismatch will confuse operators tuning thresholds and risks misconfiguration.

Apply this diff to fix all occurrences:

-  price_deviation: 0.05   # 0.05% deviation trips a push
+  price_deviation: 0.05   # 5% deviation trips a push

Repeat for all 26 entries in the file.

Also applies to: 12-12, 20-20, 28-28, 36-36, 44-44, 52-52, 60-60, 68-68, 76-76, 84-84, 92-92, 100-100, 108-108, 116-116, 124-124, 132-132, 140-140, 148-148, 156-156, 164-164, 172-172, 180-180, 188-188, 196-196, 204-204

🤖 Prompt for AI Agents
In apps/price_pusher/price-config.stable.sample.yaml around lines
4,12,20,28,36,44,52,60,68,76,84,92,100,108,116,124,132,140,148,156,164,172,180,188,196,204,
the inline comment for price_deviation: 0.05 incorrectly reads "0.05% deviation"
even though the value 0.05 represents 5% (0.05 == 5%), so update each comment to
read "5% deviation" (or "0.05 = 5% deviation") to accurately reflect the numeric
meaning and avoid operator confusion; make this change for all 26 occurrences
listed.
