Conversation

@kkondaka
Contributor

@kkondaka kkondaka commented Jan 9, 2026

Description

Add documentation for Pipeline DLQ and Prometheus sink

Issues Resolved

Closes #11753, #11754

Version

2.14

Frontend features

If you're submitting documentation for an OpenSearch Dashboards feature, add a video that shows how a user will interact with the UI step by step. A voiceover is optional.

Checklist

  • [x] By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license and subject to the Developers Certificate of Origin.
    For more information on following Developer Certificate of Origin and signing off your commits, please check here.

Signed-off-by: Kondaka <krishkdk@amazon.com>
@github-actions

github-actions bot commented Jan 9, 2026

Thank you for submitting your PR. The PR states are In progress (or Draft) -> Tech review -> Doc review -> Editorial review -> Merged.

Before you submit your PR for doc review, make sure the content is technically accurate. If you need help finding a tech reviewer, tag a maintainer.

When you're ready for doc review, tag the assignee of this PR. The doc reviewer may push edits to the PR directly or leave comments and editorial suggestions for you to address (let us know in a comment if you have a preference). The doc reviewer will arrange for an editorial review.

@kolchfa-aws kolchfa-aws added Tech review PR: Tech review in progress backport 3.4 labels Jan 9, 2026

### DLQ Pipeline
DLQ Pipeline is a dedicated “dead-letter queue” pipeline used to capture any events that Data Prepper is unable to process in any sub-pipeline or stage (source, processor, buffer, or sink). It is defined using a reserved pipeline name, dlq_pipeline, and is configured without a source. The pipeline may include optional processors and routes, but must contain at least one sink to send failed events to an external destination.
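
For illustration only, a minimal sketch of such a configuration, assuming an `opensearch` sink; the host and the `failed-events` index name are placeholders, not taken from the PR:

```yml
# Reserved name: events that fail anywhere in any pipeline land here.
# No source is configured; processors and routes are optional.
dlq_pipeline:
  sink:
    - opensearch:
        hosts: ["https://localhost:9200"]
        index: failed-events
```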
Member

I don't think we should say "sub-pipeline" as that can be confusing. It is just a pipeline.

```yml
dlq_pipeline:
  processor:
    - string_converter:
```
Member

Let's show an example with a better processor. We should probably remove this one too at some point. We have an uppercase_string processor which is better.
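
A sketch of the suggested replacement, assuming the `uppercase_string` processor's `with_keys` option; the `message` key is a placeholder:

```yml
dlq_pipeline:
  processor:
    # Suggested alternative to string_converter: uppercases the listed keys.
    - uppercase_string:
        with_keys: ["message"]
```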


# Prometheus sink

The `prometheus` sink buffers and writes batches of open telemetry metrics data in Prometheus TimeSeries format to Prometheus server using remote write API.Currently, amazon managed prometheus server is supported as the prometheus server. The configured `url` provides the amazon managed prometheus server's remote write endpoint.
Member

buffers and writes batches

Let's lead with the key aspects. So perhaps:

The prometheus sink writes OpenTelemetry (OTel) metrics data in Prometheus...


# Prometheus sink

The `prometheus` sink buffers and writes batches of open telemetry metrics data in Prometheus TimeSeries format to Prometheus server using remote write API.Currently, amazon managed prometheus server is supported as the prometheus server. The configured `url` provides the amazon managed prometheus server's remote write endpoint.
Member

Is it "remote write" or "remote-write?" I see both in their docs.

Also, is "Prometheus TimeSeries format" a proper noun? Do we even need to say this? Isn't it implied by Prometheus or does it support other formats?


# Prometheus sink

The `prometheus` sink buffers and writes batches of open telemetry metrics data in Prometheus TimeSeries format to Prometheus server using remote write API.Currently, amazon managed prometheus server is supported as the prometheus server. The configured `url` provides the amazon managed prometheus server's remote write endpoint.
Member

Let's get the case correct for proper nouns.


The `prometheus` sink only sends metric type data to prometheus server. All other types of data are sent to [DLQ pipeline](#_data-prepper/pipelines/pipeline.md), if configured

The `prometheus` sink sorts metrics by their timestamp in each batch before sending to the prometheus server.
Member

Maybe we should reword this to clarify why: To ensure that Prometheus accepts the data since it has restrictions on order of data received. And also to clarify that it supports out-of-order time windows.


## Configuration

`url` | Yes | String | path to prometheus server remote write endpoint
Member

The descriptions in other plugins are sentences. They should start with a capital letter and end with a period.

`encoding` | No | String | encoding mode (only "snappy" encoding mode is supported)
`remote_write_version` | No | String | remote write version number (only version "0.1.0" supported )
`content_type` | No | String | content type (only "application/x-protobuf")
`out_of_order_time_window` | No | Integer | Time window (in seconds) to accept late-arriving data points.
Member

This is now a Duration. Change the type and remove (in seconds). Add the default.
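
For context, the documented options above might combine into a sink configuration like this sketch. The workspace URL is a placeholder for an Amazon Managed Service for Prometheus remote write endpoint, and `out_of_order_time_window` is written as a Duration per the preceding comment:

```yml
sink:
  - prometheus:
      # Placeholder remote write endpoint, not a real workspace.
      url: "https://aps-workspaces.us-east-1.amazonaws.com/workspaces/ws-EXAMPLE/api/v1/remote_write"
      encoding: snappy
      remote_write_version: "0.1.0"
      content_type: "application/x-protobuf"
      # Duration: accept data points arriving up to 120 seconds late.
      out_of_order_time_window: 120s
```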


Labels

backport 3.4 Tech review PR: Tech review in progress


Development

Successfully merging this pull request may close these issues.

[DOC] Add documentation for Data Prepper prometheus sink

3 participants