
Compute and cache discrete masks #465

Open

younik wants to merge 6 commits into master from improve-discrete-efficiecy

Conversation

younik (Collaborator) commented Jan 29, 2026

Fixes point 2 of #377
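
The title suggests the change is about when discrete action masks get built: instead of recomputing the forward and backward masks of a states batch eagerly on every update, compute them on first access and cache the result. A minimal sketch of that lazy-caching pattern follows; the class name MyDiscreteStates, the n_actions argument, and the "exit only once every feature is set" rule are illustrative assumptions, not the PR's actual code.

import torch
from typing import Optional

class MyDiscreteStates:
    """Illustrative batch of discrete states with lazily cached masks."""

    def __init__(self, tensor: torch.Tensor, n_actions: int) -> None:
        self.tensor = tensor        # shape: (batch_size, state_dim)
        self.n_actions = n_actions  # by convention here, the last action is "exit"
        self._forward_masks: Optional[torch.Tensor] = None

    @property
    def forward_masks(self) -> torch.Tensor:
        # Compute once on first access, then reuse the cached tensor.
        if self._forward_masks is None:
            self._forward_masks = self._compute_forward_masks()
        return self._forward_masks

    def _compute_forward_masks(self) -> torch.Tensor:
        # Toy rule: every action is allowed, except that the exit action
        # is forbidden until all features of the state are set.
        masks = torch.ones(
            (self.tensor.shape[0], self.n_actions), dtype=torch.bool
        )
        is_done = (self.tensor != 0).all(dim=-1)
        masks[~is_done, -1] = False
        return masks

    def set_tensor(self, new_tensor: torch.Tensor) -> None:
        # Any change to the underlying states must invalidate the cache.
        self.tensor = new_tensor
        self._forward_masks = None

The payoff is that the mask tensor for a batch is built at most once between state updates rather than on every construction, which appears to be the "compute and cache" the title refers to.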

younik requested a review from josephdviviano on January 30, 2026 at 16:36
younik marked this pull request as ready for review on January 30, 2026 at 16:36

codecov bot commented Jan 30, 2026

Codecov Report

❌ Patch coverage is 77.36842% with 43 lines in your changes missing coverage. Please review.
✅ Project coverage is 74.32%. Comparing base (2b03b38) to head (529d424).
⚠️ Report is 1 commit behind head on master.

Files with missing lines        Patch %   Lines
src/gfn/gym/bitSequence.py       40.90%   38 Missing and 1 partial ⚠️
src/gfn/utils/distributed.py      0.00%   2 Missing ⚠️
src/gfn/gym/set_addition.py      95.65%   0 Missing and 1 partial ⚠️
src/gfn/states.py                97.14%   1 Missing ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##           master     #465      +/-   ##
==========================================
- Coverage   74.45%   74.32%   -0.14%     
==========================================
  Files          47       47              
  Lines        6890     6951      +61     
  Branches      820      824       +4     
==========================================
+ Hits         5130     5166      +36     
- Misses       1449     1475      +26     
+ Partials      311      310       -1     

☔ View full report in Codecov by Sentry.

josephdviviano (Collaborator) left a comment

I'm approving because I don't have a better idea of what to do, but I worry that moving the logic into the States class will make things more difficult.

        forward_masks[~is_done, -1] = False
        return forward_masks

    def _compute_backward_masks(self) -> torch.Tensor:

This is good work, but I'm worried that moving all of this logic into the States class makes the code more cumbersome and confusing to work with for other people. Is there another way?
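
For readers placing the truncated hunk above, a backward-mask companion in the same spirit might look like the short standalone sketch below. The encoding it assumes, in which a column of the state tensor is nonzero exactly when the corresponding backward action has something to undo, is an illustration and not necessarily what the PR's environments do.

import torch

def compute_backward_masks(states_tensor: torch.Tensor) -> torch.Tensor:
    """Allow undoing only the actions that appear to have been taken.

    Initial (all-zero) states get an all-False row, since nothing can be
    undone from the starting state.
    """
    return states_tensor != 0

Under the pattern being discussed, this result would likewise be computed once per batch and cached on the states object until the underlying tensor changes.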
