canisli/LLoCa-Transformer

Lorentz-equivariant Transformer based on Lorentz Local Canonicalization (Spinner et al. 2025)
Lorentz Local Canonicalization (LLoCa)

LLoCa is a general framework for making an arbitrary backbone (MLP, transformer, graph network, etc.) Lorentz-equivariant. Rather than enforcing equivariance through specialized tensor operations or group-theoretic layers, LLoCa canonicalizes each particle's 4-vectors into a locally constructed Lorentz frame and performs tensorial message passing between frames. This provides a lightweight, flexible alternative to dedicated Lorentz-equivariant architectures such as the Lorentz-Equivariant Geometric Algebra Transformer (L-GATr), while retaining exact Lorentz symmetry.
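The key property can be illustrated in plain NumPy. The sketch below is a toy construction, not the repo's FramesNet: here each event's frame comes from a fixed Minkowski Gram-Schmidt on the total momentum and two particles, whereas LLoCa predicts frames with a learned network. The point it demonstrates is the same: because the frame transforms covariantly with the event, the canonicalized components are exactly invariant under a global Lorentz transformation, so any backbone applied to them is automatically equivariant.

```python
import numpy as np

ETA = np.diag([1.0, -1.0, -1.0, -1.0])  # Minkowski metric, signature (+,-,-,-)

def mink(a, b):
    return a @ ETA @ b

def local_frame(p_time, p_a, p_b):
    """Minkowski Gram-Schmidt: one timelike (e0) and three spacelike unit vectors.

    Returns a Lorentz matrix L such that L @ p gives the components of p
    in the constructed frame.  (Toy stand-in for a learned FramesNet.)
    """
    e0 = p_time / np.sqrt(mink(p_time, p_time))
    u = p_a - mink(p_a, e0) * e0                       # project out e0
    e1 = u / np.sqrt(-mink(u, u))
    v = p_b - mink(p_b, e0) * e0 + mink(p_b, e1) * e1  # project out e0, e1
    e2 = v / np.sqrt(-mink(v, v))
    # fourth vector from the Levi-Civita symbol: (e3)_a = eps_{abcd} e0^b e1^c e2^d
    w = np.array([np.linalg.det(np.stack([np.eye(4)[a], e0, e1, e2]))
                  for a in range(4)])
    e3 = ETA @ w                                       # raise the index
    e3 /= np.sqrt(-mink(e3, e3))
    E = np.stack([e0, e1, e2, e3])
    return ETA @ E @ ETA

def canonicalize(p):
    """Express all four-momenta of an event in its local frame."""
    L = local_frame(p.sum(0), p[0], p[1])
    return p @ L.T

def boost_z(rapidity):
    """Global Lorentz boost along the z-axis."""
    ch, sh = np.cosh(rapidity), np.sinh(rapidity)
    G = np.eye(4)
    G[0, 0] = G[3, 3] = ch
    G[0, 3] = G[3, 0] = sh
    return G

# toy event: four unit-mass particles with momenta (E, px, py, pz)
rng = np.random.default_rng(0)
p3 = rng.normal(size=(4, 3))
p = np.column_stack([np.sqrt(1.0 + (p3**2).sum(1)), p3])

# boosting the whole event changes the momenta but not the canonicalized inputs
assert np.allclose(canonicalize(p), canonicalize(p @ boost_z(1.3).T), atol=1e-8)
```

Since the canonicalized components are invariant, an ordinary (non-equivariant) backbone acting on them yields Lorentz-invariant scalars for free; equivariant vector outputs are recovered by mapping predictions back through the inverse frame.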

This repo contains a PyTorch implementation of LLoCa applied to a Transformer backbone, including:

  • FramesNet: constructs per-particle local Lorentz frames (L and L_inv) from four-momenta.
  • LLoCaTransformer: a Transformer backbone that operates on canonicalized features; a baseline (non-equivariant) Transformer in src/transformer.py is included for comparison.
  • Example script: scripts/fit_efps.py trains both models to regress a single Energy Flow Polynomial using src/efp_dataset.py.
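The frame bookkeeping behind the L and L_inv pair above can be sketched as follows (names here are illustrative, not the repo's API). Inverting a Lorentz matrix never requires np.linalg.inv, since L^{-1} = η L^T η; and passing a vector-valued message from particle i's frame to particle j's frame amounts to one matrix product, L_j L_i^{-1}.

```python
import numpy as np

ETA = np.diag([1.0, -1.0, -1.0, -1.0])  # Minkowski metric, signature (+,-,-,-)

def lorentz_inverse(L):
    """Inverse of a Lorentz matrix via the metric: L^{-1} = eta @ L.T @ eta."""
    return ETA @ L.T @ ETA

# example frame for particle i: a boost along z with rapidity 0.7
ch, sh = np.cosh(0.7), np.sinh(0.7)
L_i = np.array([[ch, 0.0, 0.0, sh],
                [0.0, 1.0, 0.0, 0.0],
                [0.0, 0.0, 1.0, 0.0],
                [sh, 0.0, 0.0, ch]])
L_j = np.eye(4)  # a second particle's frame (trivial here for readability)

# tensorial message passing: a vector feature expressed in frame i is
# re-expressed in frame j before it is aggregated
T_ij = L_j @ lorentz_inverse(L_i)
v_i = np.array([1.0, 0.2, -0.3, 0.5])  # a vector-valued feature in frame i
v_j = T_ij @ v_i

assert np.allclose(lorentz_inverse(L_i) @ L_i, np.eye(4))
```

Scalar features need no transport matrix at all, which is what keeps the overhead of this scheme small compared with architectures built from equivariant tensor operations.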

References

  • Lorentz Local Canonicalization: How to Make Any Network Lorentz-Equivariant — Spinner et al., 2025. arXiv:2505.20280
  • Lorentz-Equivariance without Limitations — Favaro et al., 2025. arXiv:2508.14898
  • Lorentz-Equivariant Geometric Algebra Transformers for High-Energy Physics — Spinner et al., 2024. arXiv:2405.14806
