This is the source code for MEMOIR: Lifelong Model Editing with Minimal Overwrite and Informed Retention for LLMs.
The repo is based on the EasyEdit repository.
Please see the project website for more information.
MEMOIR is a scalable framework for lifelong model editing that integrates new knowledge without retraining or overwriting existing information. It stores edits in a residual memory module using sparse, sample-dependent masks to minimize interference and reduce forgetting. At inference, MEMOIR retrieves relevant knowledge by matching activation patterns, achieving balanced reliability, generalization and locality performance after thousands of sequential edits.
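To make the mechanism concrete, below is a minimal toy sketch of the idea in PyTorch. It is for exposition only: the sizes, the rank-one write, and the overlap-based gating rule (echoing the `top_k` and `irr_threshold` options used further down) are illustrative assumptions, not the code in `./easyeditor/models/memoir`.

```python
import torch

# Toy sketch of the MEMOIR idea (illustrative only, not the repo's implementation).
# A sample-dependent top-k mask decides which input features an edit may write to;
# at inference, the query's mask is compared with stored masks to decide whether
# the residual memory should contribute at all.

d, top_k = 1024, 32                  # hypothetical sizes
W_mem = torch.zeros(d, d)            # residual memory on top of the frozen layer
edit_masks = []                      # sparse masks of the edits stored so far

def sparse_mask(h: torch.Tensor, k: int = top_k) -> torch.Tensor:
    """Sample-dependent mask keeping only the k largest-magnitude features of h."""
    mask = torch.zeros_like(h)
    mask[h.abs().topk(k).indices] = 1.0
    return mask

def apply_edit(h_edit: torch.Tensor, delta: torch.Tensor) -> None:
    """Write an edit into the memory restricted to the masked input features."""
    m = sparse_mask(h_edit)
    W_mem.add_(torch.outer(delta, h_edit * m))   # only masked columns are touched
    edit_masks.append(m)

def memory_output(h_query: torch.Tensor, irr_threshold: float = 0.4) -> torch.Tensor:
    """Gate the residual memory by overlap between the query mask and stored masks."""
    if not edit_masks:
        return torch.zeros_like(h_query)
    m_q = sparse_mask(h_query)
    overlaps = torch.stack([(m_q * m).sum() for m in edit_masks]) / top_k
    if overlaps.max() < irr_threshold:           # query deemed unrelated to any edit
        return torch.zeros_like(h_query)
    return W_mem @ (h_query * m_q)               # residual added to the layer output
```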
- Code location: `./easyeditor/models/memoir`
- Hyperparameters: `./hparams/MEMOIR`
- Create the conda environment:
  ```bash
  conda create -n lifelong_edit python=3.9 -y && source activate lifelong_edit
  ```
- Install dependencies:
  ```bash
  pip install -r requirements.txt
  ```
- Set the project path:
  ```bash
  export PYTHONPATH="path_of_codes:$PYTHONPATH"
  ```
- Log in to Hugging Face:
  ```bash
  huggingface-cli login --token your_token
  ```
Please refer to the EasyEdit repository for dataset preparation.
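As a rough orientation, the ZsRE edit files used by EasyEdit are JSON lists whose records look approximately like the sketch below; the field names follow EasyEdit's ZsRE splits and may differ between versions, so treat this as a hint rather than a spec.

```python
# Approximate shape of one ZsRE edit record in an EasyEdit-style JSON file
# (illustrative; check the EasyEdit docs for the exact schema of your split).
record = {
    "subject": "<entity the edit is about>",
    "src": "<edit prompt / question>",
    "rephrase": "<paraphrase used to test generalization>",
    "alt": "<new target answer to inject>",
    "answers": ["<original ground-truth answer>"],
    "loc": "<unrelated prompt used to test locality>",
    "loc_ans": "<answer to the locality prompt>",
}
```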
Run sequential editing with MEMOIR on ZsRE:
```bash
python examples/run_wise_editing.py \
    --editing_method=MEMOIR \
    --hparams_dir=./hparams/MEMOIR/llama3-8b.yaml \
    --data_dir=data_dir \
    --ds_size=100 \
    --data_type=ZsRE \
    --sequential_edit \
    --top_k=4096 \
    --irr_threshold=0.4
```

Run sequential editing with MEMOIR on the hallucination dataset:
```bash
python examples/run_wise_editing.py \
    --editing_method=MEMOIR \
    --hparams_dir=./hparams/MEMOIR/llama3-8b.yaml \
    --data_dir=data_dir \
    --ds_size=100 \
    --data_type=hallucination \
    --sequential_edit \
    --top_k=4096 \
    --irr_threshold=0.9
```

To save background features, run:
```bash
python examples/run_wise_editing.py \
    --editing_method=MEMOIR \
    --hparams_dir=./hparams/MEMOIR/llama3-8b.yaml \
    --data_dir=data_dir \
    --ds_size=100 \
    --data_type=ZsRE \
    --sequential_edit \
    --RUN_SAVE_BACKGROUND_FEATURES=True
```
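If you prefer driving the editor from Python, the repo follows EasyEdit's `BaseEditor` interface. The sketch below mirrors EasyEdit's documented usage; the `MEMOIRHyperParams` class name, its import path, and the exact `edit()` arguments are assumptions here, so check `examples/run_wise_editing.py` for the exact invocation.

```python
# Illustrative sketch only: mirrors EasyEdit's BaseEditor interface.
# MEMOIRHyperParams and its import path are assumed names, not confirmed by this README.
from easyeditor import BaseEditor
from easyeditor.models.memoir import MEMOIRHyperParams  # hypothetical import path

hparams = MEMOIRHyperParams.from_hparams('./hparams/MEMOIR/llama3-8b.yaml')
editor = BaseEditor.from_hparams(hparams)

metrics, edited_model, _ = editor.edit(
    prompts=['Which university did Watts Humphrey attend?'],  # example from EasyEdit's docs
    target_new=['University of Michigan'],
    sequential_edit=True,
)
print(metrics)
```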
If you find this work useful, please cite:
```bibtex
@article{wang2025memoir,
  title={MEMOIR: Lifelong Model Editing with Minimal Overwrite and Informed Retention for LLMs},
  author={Wang, Ke and Qin, Yiming and Dimitriadis, Nikolaos and Favero, Alessandro and Frossard, Pascal},
  journal={arXiv preprint arXiv:2506.07899},
  year={2025}
}
```
