Make compression level adaptive and configurable #NIT-4153 #4145
Conversation
❌ 8 Tests Failed.
will fix the lint and other stuff
Tristan-Wilson left a comment:
Looks largely good once build/test issues are fixed, just some minor requests.
arbnode/batch_poster.go (Outdated)

```go
CompressionLevel:   0,
DASRetentionPeriod: daprovider.DefaultDASRetentionPeriod,
```
I see that CompressionLevels is set later, when we merge the old and new configs. That's a bit non-obvious to readers; is there a way to set it here?
CompressionLevel is set here for a reason: it defines the default value for the CLI argument. A default for CompressionLevels, however, would effectively remain unused, so I think removing both would be the better choice.
For ResolveCompressionLevel, the intent is to first check whether either of the two arguments was explicitly set. Having default values makes it harder to distinguish between “unset” and “explicitly set,” which complicates that logic.
From my perspective, leaving both fields unset and having a single, clear path for determining where the configuration comes from is easier to reason about and less error-prone.
To achieve this, I can remove CompressionLevel: 0 altogether (since it already defaults to zero).
What do you think?
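To make that concrete, here is a rough sketch of the resolution path I have in mind for ResolveCompressionLevel. It is illustrative only: the type and field names are hypothetical and the defaults are placeholders, not the exact code in this PR.

```go
package batchposter

import "errors"

// CompressionLevelEntry is a hypothetical name for one backlog-threshold ->
// compression-level mapping; the real struct in the PR may differ.
type CompressionLevelEntry struct {
	BacklogThreshold   int // backlog size at which this entry starts applying
	Level              int // compression level for freshly built batches
	RecompressionLevel int // level used when recompressing
}

// Placeholder defaults for illustration only.
var defaultCompressionLevels = []CompressionLevelEntry{
	{BacklogThreshold: 0, Level: 6, RecompressionLevel: 6},
}

// ResolveCompressionLevel sketches the "single clear path" described above:
// because neither field has a non-zero default, "explicitly set" simply means
// "non-zero / non-empty", so the precedence between the deprecated flag and
// the new one stays unambiguous.
func ResolveCompressionLevel(deprecatedLevel int, levels []CompressionLevelEntry) ([]CompressionLevelEntry, error) {
	switch {
	case deprecatedLevel != 0 && len(levels) > 0:
		return nil, errors.New("set either compression-level or compression-levels, not both")
	case deprecatedLevel != 0:
		// Legacy flag: wrap the single level into a one-entry list.
		return []CompressionLevelEntry{{Level: deprecatedLevel, RecompressionLevel: deprecatedLevel}}, nil
	case len(levels) > 0:
		return levels, nil
	default:
		// Neither flag was set: fall back to built-in defaults.
		return defaultCompressionLevels, nil
	}
}
```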
Still some errors like this in the logs
Codecov Report ❌

```
@@            Coverage Diff             @@
##           master    #4145      +/-   ##
==========================================
- Coverage   33.35%   29.37%   -3.98%
==========================================
  Files         453      454       +1
  Lines       55536    55601      +65
==========================================
- Hits        18524    16335    -2189
- Misses      33774    36303    +2529
+ Partials     3238     2963     -275
```
In this PR we are replacing the current fixed compression levels and static thresholds with something operators can configure.
The previous --node.batch-poster.compression-level flag is deprecated in this PR and replaced by --node.batch-poster.compression-levels, which accepts a JSON array. With it you can define, for each backlog threshold, which compression level and which block/recompression level should be used (see the sketch below). There are validation rules on this flag, for example the compression level should not increase from entry i to entry n. More details are in the PR code and in issue #NIT-4153.
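Purely as an illustration of the idea, here is a hypothetical example of such a JSON array and how it could map onto a Go struct. The field names (backlog, level, recompression-level) are assumptions for this sketch, not necessarily the schema used in the PR.

```go
package main

import (
	"encoding/json"
	"fmt"
	"log"
)

// compressionLevelEntry is a hypothetical shape for one element of the
// --node.batch-poster.compression-levels array; the real field names in the
// PR may differ.
type compressionLevelEntry struct {
	Backlog            int `json:"backlog"`             // backlog size at which this entry starts applying
	Level              int `json:"level"`               // compression level for normal batch building
	RecompressionLevel int `json:"recompression-level"` // level used when recompressing
}

func main() {
	// Illustrative flag value: compression gets cheaper as the backlog grows,
	// which also satisfies the "levels must not increase" validation rule.
	raw := `[
		{"backlog": 0,  "level": 11, "recompression-level": 11},
		{"backlog": 20, "level": 6,  "recompression-level": 11},
		{"backlog": 60, "level": 2,  "recompression-level": 6}
	]`

	var levels []compressionLevelEntry
	if err := json.Unmarshal([]byte(raw), &levels); err != nil {
		log.Fatal(err)
	}
	fmt.Printf("%+v\n", levels)
}
```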