Codecov Report
Attention: Patch coverage is

Additional details and impacted files:

```
@@           Coverage Diff           @@
##             main      #97     +/-  ##
==========================================
+ Coverage   84.81%   85.15%   +0.33%
==========================================
  Files          10       10
  Lines         540      687     +147
==========================================
+ Hits          458      585     +127
- Misses         82      102      +20
```
This PR is ready for review. @mroavi, @ArrogantGao
Can this code be used to evaluate the joint distribution of multiple variables?
No. Is that doable in BP?
I don't think that is possible. Then, since marginals of single variables can be calculated by a pure forward process, why the function
Hard to explain, maybe we can schedule a discussion. In short, I use this trick to compute the "message" sent from a tensor to multiple hyperedges simultaneously.
I could not run the test suite in Julia 1.11.5. I get the following error:

```
(TensorInference) pkg> resolve
ERROR: Unsatisfiable requirements detected for package Statistics [10745b16]:
 Statistics [10745b16] log:
 ├─possible versions are: 1.11.0 - 1.11.1 or uninstalled
 └─restricted to versions 1.10.0 by an explicit requirement — no versions left
```

It looks like the new dependency restricts Statistics to 1.10.0, which conflicts with Julia 1.11. Could you please check if the compat bounds can be relaxed? Also, I noticed that our
That is strange. I am also using Julia 1.11.5, and I confirm that the tests are runnable.
Updates:
I managed to run the test suite. All tests passed. @GiggleLiu I have a question: in a graph/network with loops (and relatively high complexity), belief propagation should be faster at calculating the marginals of all variables than an exact approach. I was wondering if you had a chance to check or observe this?
I have not done a serious benchmark, but it should be true, since its computational cost is only
A typical number of iterations to converge is 10^2 in the tests. However, @ArrogantGao told me that in some corner cases it may fail to converge. BTW @ArrogantGao, I borrowed the default damping factor from your code; do you have any insight about this number?
I borrowed the damping factor from https://github.com/stecrotti/BeliefPropagation.jl. An example of the corner case is the antiferromagnetic spin glass with uniform
The run time of BP is quite complex to characterize; in general, if it converges, its cost is linear in the problem size.
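The damping trick discussed above can be sketched in a few lines. This is a language-agnostic illustration in Python rather than the package's Julia code; the function name and the damping value 0.5 are made up for the example and are not the package's defaults:

```python
import numpy as np

def damped_update(old_msg, new_msg, damping):
    """One damped BP message update: mix the freshly computed message
    with the previous one, then renormalize.

    damping = 0 recovers the plain BP update; values closer to 1 slow
    the iteration down, which can help on loopy or frustrated graphs
    where undamped message updates oscillate.
    """
    mixed = (1.0 - damping) * np.asarray(new_msg) + damping * np.asarray(old_msg)
    return mixed / mixed.sum()  # keep the message normalized

old = np.array([0.5, 0.5])
new = np.array([0.8, 0.2])
print(damped_update(old, new, damping=0.5))  # → [0.65 0.35]
```

The corner cases mentioned above (e.g. frustrated antiferromagnetic instances) are exactly where a nonzero damping factor can restore convergence at the price of more iterations.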
New features
This PR implements a simple belief propagation (BP) algorithm for computing marginal probabilities. The method is exact for tree-like tensor networks. It can also be used to gauge tensor networks (e.g. for data compression), even when it is not exact. The following APIs are added:
- `BeliefPropagation`: create a belief propagation model instance, which can be derived from a UAI model.
- `belief_propagate`: perform message passing until convergence; returns a `BPState` object and extra info.
- `random_tensor_train_uai` and `random_matrix_product_uai`: two UAI models for testing.

Example
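The claim that BP is exact on tree-like networks can be checked on a small chain model, where the single-variable marginals come out of a forward and a backward message sweep. This is a hedged sketch in Python rather than the package's Julia API; the model setup and helper names are invented for the illustration:

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)

# Length-n chain: p(x) ∝ prod_i phi_i(x_i) * prod_i psi_i(x_i, x_{i+1}).
# A chain is a tree, so sum-product BP gives exact marginals.
n = 4
phi = [rng.random(2) + 0.1 for _ in range(n)]           # unary factors
psi = [rng.random((2, 2)) + 0.1 for _ in range(n - 1)]  # pairwise factors

# Forward messages f[i](x_i): everything left of site i summed out.
f = [np.ones(2)]
for i in range(n - 1):
    f.append(psi[i].T @ (f[i] * phi[i]))

# Backward messages b[i](x_i): everything right of site i summed out.
b = [np.ones(2) for _ in range(n)]
for i in range(n - 2, -1, -1):
    b[i] = psi[i] @ (b[i + 1] * phi[i + 1])

def marginal(i):
    """Exact single-variable marginal from the two incoming messages."""
    m = f[i] * phi[i] * b[i]
    return m / m.sum()

def brute_marginal(i):
    """Reference answer by enumerating all 2^n configurations."""
    m = np.zeros(2)
    for x in itertools.product([0, 1], repeat=n):
        w = np.prod([phi[k][x[k]] for k in range(n)]) * \
            np.prod([psi[k][x[k], x[k + 1]] for k in range(n - 1)])
        m[x[i]] += w
    return m / m.sum()

for i in range(n):
    assert np.allclose(marginal(i), brute_marginal(i))
```

On a loopy network the same message updates are only approximate, which is where the convergence and damping discussion above comes in.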
Some API changes:

- Renamed the `mars` keyword argument of `TensorNetworkModel` to `unity_tensors_labels`, to be more accurate.
- Renamed the `vars` field of `TensorNetworkModel` to `nvars`; it seems not necessary to name the variables of a `TensorNetworkModel`.
- OMEinsum.