Conversation
|
I acknowledge that I have seen this, but it will take more time for me to merge, because I want to port this to the generator script. (Which I need to do in a VM due to Python 2. Rewriting the generator in Rust would be nice in theory but has never been important enough. I tried porting the script to Python 3 in September but gave up, because Python 3's changes to how comparison methods work were disruptive enough to make porting non-trivial.) People have complained about test data being included before, so I guess it's time to remove it. It remains to be seen whether Linux distros will then complain about the crates.io package not being complete.
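For context on why the comparison changes are disruptive, here is a minimal sketch (not taken from the actual generator): Python 2 classes could implement a single `__cmp__` method to make objects sortable, while Python 3 ignores `__cmp__` entirely, so every such class has to be rewritten with rich comparison methods. The `CodePointRange` class below is a hypothetical stand-in for the generator's data classes.

```python
import functools

# Python 2 style would have been:
#     def __cmp__(self, other):
#         return cmp(self.start, other.start)
# Python 3 removes both __cmp__ and the cmp() builtin, so sorting
# such objects raises TypeError until rich comparisons are defined.

@functools.total_ordering
class CodePointRange:
    def __init__(self, start):
        self.start = start

    def __eq__(self, other):
        return self.start == other.start

    def __lt__(self, other):
        return self.start < other.start

ranges = [CodePointRange(0x30A0), CodePointRange(0x0041)]
print(sorted(ranges)[0].start)  # the lower code point sorts first
```

`functools.total_ordering` derives the remaining comparison operators from `__eq__` and `__lt__`, which keeps the port mechanical once the pattern is established; the tedium is that it has to be applied to every class that previously relied on `__cmp__`.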
|
I've moved test data to a separate PR to unblock that change. |
|
As for the generation script, maybe you can make it write the arrays to different Rust files of a helper crate, and use the Rust code I've provided as a post-processing step? It's not elegant, but elegance doesn't seem very important given how rarely, if ever, this needs to run.
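A rough sketch of the split being suggested, assuming hypothetical table names and stand-in data (the real generator emits many more tables): the Python script writes each generated array into its own file inside a helper crate, and a separate post-processing step can then turn individual files into binary blobs.

```python
import os

# Stand-in data; the real tables come from the indexes the generator parses.
tables = {"jis0208": [1, 2, 3], "big5": [4, 5, 6]}

os.makedirs("helper/src", exist_ok=True)

for name, values in tables.items():
    # One Rust source file per table, so each can be post-processed
    # (or regenerated) independently.
    with open("helper/src/%s.rs" % name, "w") as f:
        f.write("pub static %s: [u16; %d] = [\n" % (name.upper(), len(values)))
        for v in values:
            f.write("    %d,\n" % v)
        f.write("];\n")
```

Not pretty, but it keeps the Python side dumb and pushes the format-specific work into a step that only runs when the data actually changes.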
|
Acknowledgement that I haven't forgotten about this. Happily, thanks to #124, the generator script now works with Python 3. |
I've noticed that `encoding_rs` is in the top 10 crates using the most bandwidth on all of crates.io. I've managed to make the package 3× smaller:

- Moved the tests that depend on the test data into `test_data` and excluded them and the data from the tarball. All other tests still work even when run from the crates.io package.
- Replaced the `[u8; 2]` tables with `include_bytes!`. These tables were using 18 bytes of source text to represent two bytes of data. Now the bytes aren't even loaded when their feature flags are off. I haven't changed the `u16` tables, because they're trickier due to alignment.

The `generate-encoding-data.py` script is for Python 2, so I was unable to run it. I've dumped the data this way:

The diff for deletion is huge, but commits should be viable individually.
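To illustrate the `include_bytes!` conversion, here is a minimal sketch, assuming a made-up three-entry table (the real dump was done against the existing generated tables): each `[u8; 2]` pair is written as two raw bytes to a `.bin` file, which Rust can then embed with `static TABLE: &[u8; 6] = include_bytes!("example_table.bin");`.

```python
# Hypothetical example data standing in for one generated [u8; 2] table.
pairs = [(0x00, 0x41), (0x00, 0x42), (0x30, 0x42)]

# Write the raw bytes instead of Rust source text: 2 bytes per entry
# on disk, versus roughly 18 characters per entry as source code.
with open("example_table.bin", "wb") as f:
    for hi, lo in pairs:
        f.write(bytes([hi, lo]))
```

Because `include_bytes!` produces a `&[u8; N]`, alignment is always 1, which is why this works for the `[u8; 2]` tables but not directly for the `u16` ones.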