
Mamba-Neural-Operator

Alias-Free Mamba Neural Operator

Jianwei Zheng1, Wei Li1, Ni Xu1, Junwei Zhu1, Xiaoxu Lin1, Xiaoqin Zhang1 📧

1 Zhejiang University of Technology

(📧) corresponding author.

NeurIPS 2024 (conference paper)

Abstract

Benefiting from booming deep learning techniques, neural operators (NOs) are considered an ideal alternative for breaking the tradition of solving Partial Differential Equations (PDEs) at expensive cost. Yet despite remarkable progress, current solutions pay little attention to holistic function features (both global and local information) during the process of solving PDEs. Besides, a meticulously designed kernel integration that meets the desired performance often suffers from a severe computational burden, such as GNO with $O(N(N-1))$, FNO with $O(N\log N)$, and Transformer-based NOs with $O(N^2)$. To counteract this dilemma, we propose a Mamba neural operator with $O(N)$ computational complexity, namely MambaNO. Functionally, MambaNO achieves a clever balance between global integration, facilitated by the state space model of Mamba that scans the entire function, and local integration, engaged with an alias-free architecture. We prove a property of continuous-discrete equivalence to show the capability of MambaNO in approximating operators arising from universal PDEs to the desired accuracy. MambaNO is evaluated on a diverse set of benchmarks with possibly multiscale solutions and sets new state-of-the-art scores, yet with fewer parameters and better efficiency.
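To make the $O(N)$ claim concrete, the sketch below runs a plain discretized state-space recurrence over $N$ samples of a 1D function in a single pass. It is a minimal illustration of the scan underlying Mamba-style kernel integration, not the code in this repository; all names here (ssm_scan, A, B, C, d_state) are chosen purely for exposition.

```python
# Illustrative sketch (not the repository's implementation): a discretized
# state-space recurrence h_t = A h_{t-1} + B x_t, y_t = C h_t, scanned once
# over the N input points. The single pass is why this kind of kernel
# integration costs O(N), versus O(N^2) for attention or O(N log N) for FFTs.
import numpy as np

def ssm_scan(x, A, B, C):
    """Scan a length-N sequence with a d_state-dimensional linear SSM."""
    N, = x.shape
    d_state = A.shape[0]
    h = np.zeros(d_state)        # hidden state
    y = np.empty(N)
    for t in range(N):           # one pass over the samples: O(N) in sequence length
        h = A @ h + B * x[t]     # state update
        y[t] = C @ h             # read-out
    return y

# Toy usage on 1024 samples of a 1D function.
rng = np.random.default_rng(0)
d_state = 4
A = 0.9 * np.eye(d_state)        # stable toy dynamics
B = rng.standard_normal(d_state)
C = rng.standard_normal(d_state)
x = np.sin(np.linspace(0, 2 * np.pi, 1024))
y = ssm_scan(x, A, B, C)
print(y.shape)                   # (1024,)
```

In MambaNO this global scan is paired with local, alias-free integration; the snippet only illustrates why the global part scales linearly in the number of function samples.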

Overview

(Figure: PDE visualization.)

Datasets 💾

To fetch the datasets, run the download_data.py script, which downloads all required data into the appropriate folder.

Note: This script requires wget to be installed on your system.

python3 download_data.py
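If you are unsure whether wget is on your PATH, a quick check such as the one below can save a failed run. This is an illustrative snippet, not part of download_data.py.

```python
# Illustrative check (not part of download_data.py): verify that wget is
# available before launching the download script.
import shutil
import subprocess
import sys

if shutil.which("wget") is None:
    sys.exit("wget not found; please install it (e.g. via your package manager) first.")

subprocess.run([sys.executable, "download_data.py"], check=True)
```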

Train

python3 TrainMambaNO.py

Test

Acknowledgement ❤️

This project builds on CNO (paper, code), VM-UNet (paper, code), and Mamba (github).

Thanks for their wonderful work.

Citation

If you find the Mamba Neural Operator useful in your research or applications, please consider giving us a star 🌟 and citing it with the following BibTeX entry.

@inproceedings{NEURIPS2024_5ee553ec,
 author = {Zheng, Jianwei and Li, Wei and Xu, Ni and Zhu, Junwei and Lin, Xiaoxu and Zhang, Xiaoqin},
 booktitle = {Advances in Neural Information Processing Systems},
 editor = {A. Globerson and L. Mackey and D. Belgrave and A. Fan and U. Paquet and J. Tomczak and C. Zhang},
 pages = {52962--52995},
 publisher = {Curran Associates, Inc.},
 title = {Alias-Free Mamba Neural Operator},
 url = {https://proceedings.neurips.cc/paper_files/paper/2024/file/5ee553ec47c31e46a1209bb858b30aa5-Paper-Conference.pdf},
 volume = {37},
 year = {2024}
}
