Synchronize test-suite branch with main #152
Merged
…e each repetition's results in a different subdirectory (currently breaks performance test graph generation)
Ran `golangci-lint run ./... -E stylecheck,revive,gocritic`; also removed additional commented-out code
Send ping to have pong added to the tracker, then have it update the outpath
Closes #113
…before saving and plotting the aggregated data
…remeasure performance test results accordingly
Add mDNS
Closes #111
Closes #114
…nts between repetitions
…s performance test reliability
…nce-tests: Repeated performance tests, performance tests in CI
…ark to stop extra cancels
…d while reading RFC.
…session. Addresses #135
Closes #132
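One of the commits above stores each repetition's results in its own subdirectory before aggregation. A minimal sketch of that layout, assuming hypothetical names (`REPS`, `perf_results`, `results.txt` are illustrative, not the repo's actual paths):

```shell
#!/bin/sh
# Run N repetitions of a performance test, writing each repetition's
# output into its own subdirectory so runs do not overwrite each other.
REPS=3
OUTDIR="perf_results"

for i in $(seq 1 "$REPS"); do
    REP_DIR="$OUTDIR/rep_$i"
    mkdir -p "$REP_DIR"
    # A real performance test would write its measurements here instead.
    echo "repetition $i placeholder data" > "$REP_DIR/results.txt"
done
```

Keeping one directory per repetition lets an aggregation step glob over `rep_*` afterwards instead of parsing indices out of filenames.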
* Add artifact for performance test data (with commit hash as unique ID), and allow multiple performance tests per CI run by putting the test number in filenames
* Try to download artifact from other commit in workflow
* Refine downloading artifact from other commit
* Try to fix workflow failure
* Debug other failure
* ..
* Fix failure (typo in zip name)
* Revert adding index to performance test data (was not necessary due to files being in subfolders); small changes when saving performance test data
* Forgot to remove index at one point in code
* Add script to compare performance between two performance tests, and deploy it in CI
* Debug workflow failure
* Attempt fix
* ..
* Ignore comparison artifact's workflow conclusion
* Account for edge case of the baseline performance value being 0, which would make the comparison report worse performance for any non-zero value, no matter how small
* Add performance result to workflow job summary
* Remove debug job steps
* Test worse performance by simulating extra packet loss
* Fix logic bug in edge case handling
* Retry testing worse performance after fixing bug
* Update path regex to solve problem of no files being iterated over in comparison; add more detailed logging
* Add debug step
* Fix wrongly specified path to new performance test data
* Attempt to fix syntax error; increase packet loss
* Replace case statement by if/elif/else
* Fix syntax error
* Debug exit_code variable
* Continue debugging
* New approach: pipe comparison output directly to job summary; also add edge case for better performance
* Format comparison script output as Markdown
* Fix wrong redirect variable
* Get rid of simulated packet loss; fix bug with booleans in comparison script
* Uncomment other workflows
* Make performance change notifications more readable
* Exit successfully with a warning if downloading the artifact of a previous performance test fails
* Prevent steps dependent on a successful artifact download from running when this step fails
* Prevent false positives/negatives in comparison by making bounds more lenient and repeating the performance test
* Make bounds more lenient for small values
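The zero-baseline edge case mentioned above is easy to reproduce: any relative-change formula divides by the baseline, so a baseline of 0 flags every non-zero measurement as a regression. A hedged sketch of the guard, assuming this is the general shape of the comparison (the function name, threshold, and "higher is worse" convention are hypothetical, not the repo's actual script):

```shell
#!/bin/sh
# Compare a new metric value against a baseline; prints "worse", "better",
# "similar", or a note when the baseline was zero. Assumes a metric where
# higher values are worse (e.g. latency); integer arithmetic for simplicity.
compare_metric() {
    baseline="$1"
    new="$2"
    threshold=10  # allowed relative change in percent (illustrative)

    if [ "$baseline" -eq 0 ]; then
        # Without this guard, any non-zero new value would yield a huge
        # relative change and always be reported as worse performance.
        if [ "$new" -eq 0 ]; then
            echo "similar"
        else
            echo "changed from zero baseline"
        fi
        return
    fi

    change=$(( (new - baseline) * 100 / baseline ))
    if [ "$change" -gt "$threshold" ]; then
        echo "worse"
    elif [ "$change" -lt "-$threshold" ]; then
        echo "better"
    else
        echo "similar"
    fi
}
```

Treating the zero-baseline case as its own outcome (rather than forcing it into "worse" or "better") matches the intent of the edge-case commits above.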
* Update log levels (remove trace, add warn/error)
* Add trace log level to test client; undo removal in system test scripts
* Dockerfile for image to run system tests
* Fix error occurring in Docker container Python version
* Optimize system test speed:
  - Replace `sleep 1s` by `sleep 0.1s`
  - Instead of logging and checking both IPv4 and IPv6 HTTP server connections to make sure the peer is finished, only check the IPv6 server (since the peers connect to this one last)
  - Add a sleep period to desynchronize the peers; this avoids two handshakes potentially being initiated at the same time, causing a 5-second backoff period
  - In tail commands, add the flag `-s 0.1` to specify a polling rate of 100 ms; leaving this unspecified drastically slows down the commands when running in Docker containers
* Only build the eduP2P client and server binaries when the new -b flag is specified (can be omitted for speed if the binaries have already been built), and use the flag in CI
* Fix wrong variable in -b flag check
* Add -t option to system_tests.sh to specify how many threads should run in parallel
* Monitor progress of each thread and add script to display progress to the user
* Delete conntrack entries before each test to ensure only connections made in the current test are present in the logs
* Add sleep to both peers before they start connecting; this prevents failures with the NAT hairpinning tests, which seem to be caused by the nftables rules not having enough time to be added before the peers try to connect
* Refactor Dockerfile to optimize caching by installing requirements before cloning the whole repository, and add a build argument to specify which branch to clone
* Use parallel system tests in CI; speed up by caching the Docker build and not installing Python dependencies
* Document parallel system tests
* Move new features funded by NLnet to a separate changelog, and add the parallel system tests feature
* Add -b flag to sequential system tests to ensure the eduP2P binaries exist before running the tests
* Remove unused variable
* Add peer HTTP servers' output to test logs
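The `-b` and `-t` flags described above suggest standard `getopts` handling. A minimal sketch, assuming hypothetical defaults; the actual system_tests.sh may parse its options differently:

```shell
#!/bin/sh
# Parse -b (rebuild binaries) and -t <n> (parallel threads) options.
parse_opts() {
    # Reset getopts state and defaults so the function is reusable.
    OPTIND=1
    BUILD=false
    THREADS=1
    while getopts "bt:" opt "$@"; do
        case "$opt" in
            b) BUILD=true ;;          # rebuild the eduP2P client/server binaries
            t) THREADS="$OPTARG" ;;   # number of test threads to run in parallel
            *) echo "usage: system_tests.sh [-b] [-t threads]" >&2; return 1 ;;
        esac
    done
}
```

Making `-b` opt-in matches the commit's rationale: skipping the build is a pure speed-up when the binaries already exist, while CI always passes `-b` to build from scratch.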