hpca-uji/Neural-Mimicking

Pruning & Recovery Pipeline

This script trains a transfer-learning model, prunes it with a selected criterion, and optionally recovers its accuracy through fine-tuning or neural mimicking.

Requirements

Install dependencies:

pip install torch torchvision

Usage

Run the script using:

python main.py --dataset_name DATASET --model_name MODEL --pruning_criteria CRITERIA --recovery_method RECOVERY [--train]

Arguments

  • --dataset_name: Dataset to use. Options:
    • mnist, cifar10, cifar100, imagenette, tiny_imagenet, imagenet1k
  • --model_name: Model architecture. Options:
    • resnet18, resnet50, alexnet, squeezenet, vit
  • --pruning_criteria: Pruning method. Options:
    • l1, random, lobs, nnrelief_paper
  • --recovery_method: Recovery method after pruning. Options:
    • none
    • fine_tuning
    • neural_mimicking_lbfgs
    • neural_mimicking_qr
    • neural_mimicking_svd
    • neural_mimicking_sgd
  • --train: If set, the model will be trained before pruning; otherwise, it will load weights from ./weights/MODEL_DATASET.pth.
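The command-line interface above can be sketched with `argparse`. The choices mirror the options listed; everything else (defaults, help strings) is an assumption, not the repository's actual code:

```python
import argparse

def build_parser():
    """Parser mirroring the documented CLI (illustrative sketch)."""
    p = argparse.ArgumentParser(description="Pruning & recovery pipeline")
    p.add_argument("--dataset_name", required=True,
                   choices=["mnist", "cifar10", "cifar100", "imagenette",
                            "tiny_imagenet", "imagenet1k"])
    p.add_argument("--model_name", required=True,
                   choices=["resnet18", "resnet50", "alexnet", "squeezenet", "vit"])
    p.add_argument("--pruning_criteria", required=True,
                   choices=["l1", "random", "lobs", "nnrelief_paper"])
    p.add_argument("--recovery_method", required=True,
                   choices=["none", "fine_tuning", "neural_mimicking_lbfgs",
                            "neural_mimicking_qr", "neural_mimicking_svd",
                            "neural_mimicking_sgd"])
    p.add_argument("--train", action="store_true",
                   help="Train before pruning instead of loading saved weights")
    return p

if __name__ == "__main__":
    args = build_parser().parse_args()
    print(args)
```

Because every option is validated by `choices`, a typo such as `--model_name vgg16` fails fast with a usage message instead of surfacing later as a missing-weights error.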

Examples

Train and prune with fine-tuning:

python main.py --dataset_name cifar100 --model_name vit --pruning_criteria l1 --recovery_method fine_tuning --train

Load weights and prune with neural mimicking:

python main.py --dataset_name cifar100 --model_name vit --pruning_criteria lobs --recovery_method neural_mimicking_lbfgs
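The `neural_mimicking_qr` / `neural_mimicking_svd` solver names suggest a least-squares formulation: refit the surviving weights of a pruned layer so that it reproduces the dense layer's outputs on calibration data. A minimal per-layer sketch for a linear layer, assuming a binary keep-mask (the function name, shapes, and solver choice are illustrative, not the repository's actual API):

```python
import torch

def mimic_layer_lstsq(X, W_dense, mask):
    """Refit surviving weights so the pruned layer matches the dense
    layer's outputs on calibration inputs, via least squares
    (torch.linalg.lstsq uses a QR-based solver for full-rank inputs).

    X       : (n_samples, in_features) calibration activations
    W_dense : (out_features, in_features) original dense weights
    mask    : (out_features, in_features) 1 = keep, 0 = pruned
    """
    Y = X @ W_dense.T                      # targets: dense-layer outputs
    W_new = torch.zeros_like(W_dense)
    for j in range(W_dense.shape[0]):      # one output unit at a time
        keep = mask[j].bool()
        if keep.any():
            sol = torch.linalg.lstsq(X[:, keep], Y[:, j:j + 1]).solution
            W_new[j, keep] = sol.squeeze(1)
    return W_new
```

Solving per output unit keeps each system small; the LBFGS/SGD variants would instead minimize the same output-matching loss iteratively, which scales better to convolutional layers.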

Output

  • Training and evaluation logs
  • Accuracy after each pruning rate (10% to 90%)
  • Saved weights at: ./weights/MODEL_DATASET.pth
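The 10%-to-90% sweep can be reproduced with `torch.nn.utils.prune`; the sketch below uses global L1 unstructured pruning on a fresh copy of the model per rate (the repository's own sweep, step size, and pruning scope may differ):

```python
import copy
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

def sweep_pruning_rates(model, evaluate, rates=None):
    """Evaluate a model pruned at each rate (default 10%..90%).

    Each rate is applied to a deep copy of the original model, so the
    rates are independent rather than cumulative.
    """
    rates = rates or [r / 10 for r in range(1, 10)]
    results = {}
    for rate in rates:
        pruned = copy.deepcopy(model)
        params = [(m, "weight") for m in pruned.modules()
                  if isinstance(m, (nn.Linear, nn.Conv2d))]
        prune.global_unstructured(params,
                                  pruning_method=prune.L1Unstructured,
                                  amount=rate)
        for m, name in params:
            prune.remove(m, name)          # fold masks into the weights
        results[rate] = evaluate(pruned)
    return results
```

`evaluate` would be the script's test-set accuracy function; `prune.remove` makes the sparsity permanent so the returned dict maps each rate to the metric of a genuinely sparse model.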
