Add documentation for Pipeline DLQ and Prometheus sink #11755
Conversation
Signed-off-by: Kondaka <krishkdk@amazon.com>
Thank you for submitting your PR. The PR states are In progress (or Draft) -> Tech review -> Doc review -> Editorial review -> Merged. Before you submit your PR for doc review, make sure the content is technically accurate. If you need help finding a tech reviewer, tag a maintainer. When you're ready for doc review, tag the assignee of this PR. The doc reviewer may push edits to the PR directly or leave comments and editorial suggestions for you to address (let us know in a comment if you have a preference). The doc reviewer will arrange for an editorial review.
### DLQ Pipeline

DLQ Pipeline is a dedicated “dead-letter queue” pipeline used to capture any events that Data Prepper is unable to process in any sub-pipeline or stage (source, processor, buffer, or sink). It is defined using a reserved pipeline name, dlq_pipeline, and is configured without a source. The pipeline may include optional processors and routes, but must contain at least one sink to send failed events to an external destination.
I don't think we should say "sub-pipeline" as that can be confusing. It is just a pipeline.
```yml
dlq_pipeline:
  processor:
    - string_converter:
```
Let's show an example with a better processor. We should probably remove this one too at some point. We have an uppercase_string processor which is better.
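Applying that suggestion, the snippet might look like the following. This is only a sketch: the `uppercase_string` processor's `with_keys` option, the `message` key, and the `opensearch` sink settings shown here are illustrative assumptions, not part of the PR.

```yml
dlq_pipeline:
  processor:
    # Assumed option name; uppercases the listed keys on each failed event
    - uppercase_string:
        with_keys:
          - message
  sink:
    # Illustrative sink; a DLQ pipeline must have at least one sink
    - opensearch:
        hosts: ["https://localhost:9200"]
        index: dlq-events
```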
# Prometheus sink

The `prometheus` sink buffers and writes batches of open telemetry metrics data in Prometheus TimeSeries format to Prometheus server using remote write API. Currently, amazon managed prometheus server is supported as the prometheus server. The configured `url` provides the amazon managed prometheus server's remote write endpoint.
> buffers and writes batches

Let's lead with the key aspects. So perhaps: The `prometheus` sink writes OpenTelemetry (OTel) metrics data in Prometheus...
Is it "remote write" or "remote-write?" I see both in their docs.
Also, is "Prometheus TimeSeries format" a proper noun? Do we even need to say this? Isn't it implied by Prometheus or does it support other formats?
Let's get the case correct for proper nouns.
The `prometheus` sink only sends metric type data to prometheus server. All other types of data are sent to [DLQ pipeline](#_data-prepper/pipelines/pipeline.md), if configured
The `prometheus` sink sorts metrics by their timestamp in each batch before sending to the prometheus server.
Maybe we should reword this to clarify why: To ensure that Prometheus accepts the data since it has restrictions on order of data received. And also to clarify that it supports out-of-order time windows.
## Configuration

`url` | Yes | String | path to prometheus server remote write endpoint
The descriptions in other plugins are sentences. They should start with a capital letter and end with a period.
`encoding` | No | String | encoding mode (only "snappy" encoding mode is supported)
`remote_write_version` | No | String | remote write version number (only version "0.1.0" supported)
`content_type` | No | String | content type (only "application/x-protobuf")
`out_of_order_time_window` | No | Integer | Time window (in seconds) to accept late-arriving data points.
This is now a Duration. Change the type and remove (in seconds). Add the default.
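Putting these options together, a sink entry might look like the following sketch. The endpoint URL (including `<workspace-id>`) is a placeholder, and the Duration value format (e.g., `5m`) is an assumption based on the reviewer's note that `out_of_order_time_window` is now a Duration.

```yml
sink:
  - prometheus:
      # Placeholder remote write endpoint for an Amazon Managed Service for Prometheus workspace
      url: https://aps-workspaces.us-east-1.amazonaws.com/workspaces/<workspace-id>/api/v1/remote_write
      encoding: snappy                      # only "snappy" is supported
      remote_write_version: "0.1.0"         # only "0.1.0" is supported
      content_type: application/x-protobuf  # only this content type is supported
      out_of_order_time_window: 5m          # Duration; accepts late-arriving data points
```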
Description
Add documentation for Pipeline DLQ and Prometheus sink
Issues Resolved
Closes #11753, #11754
Version
2.14
Frontend features
If you're submitting documentation for an OpenSearch Dashboards feature, add a video that shows how a user will interact with the UI step by step. A voiceover is optional.
Checklist
For more information on following Developer Certificate of Origin and signing off your commits, please check here.