
DisCoPatch

Overview of DisCoPatch's architecture.

The official implementation of DisCoPatch in PyTorch.

Prerequisites

You will need:

  • Python (see pyproject.toml for the required version)
  • Git
  • Make
  • a .secrets file with the required secrets and credentials
  • environment variables loaded from .env
  • CUDA >= 12.4

Installation

Clone this repository (requires Git SSH keys):

git clone --recursive git@github.com:caetas/DisCoPatch.git
cd DisCoPatch

Install dependencies

conda env create -f environment.yml
conda activate python3.11

or, if the environment already exists:

conda activate python3.11

On Linux

Set up the virtualenv using the Makefile recipe:

(python3.11) $ make setup-all

You might need to run the following command once to enable automatic activation of the conda environment and the virtualenv:

direnv allow

Feel free to edit the .envrc file if you prefer to activate the environments manually.

On Windows

You can set up the virtualenv by running the following commands:

python -m venv .venv-dev
.venv-dev/Scripts/Activate.ps1
python -m pip install --upgrade pip setuptools
python -m pip install -r requirements/requirements.txt

To run the code, remember to always activate both environments:

conda activate python3.11
.venv-dev/Scripts/Activate.ps1

OOD Benchmark

The evaluation of these models closely follows OpenOOD's benchmark. Three OOD levels are defined: Near-OOD, which exhibits semantic shifts relative to the ID datasets; Far-OOD, which encompasses both semantic and domain shifts; and Covariate Shift OOD, which involves corruptions of the ID set. There are also four well-defined ID datasets:

  • ImageNet-1K
    • Near-OOD: SSB-hard, NINCO
    • Far-OOD: iNaturalist, DTD, OpenImage-O
    • Covariate Shift OOD: ImageNet(-C)
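As a concrete illustration of the benchmark's headline metrics (not code from this repository), here is a minimal NumPy sketch of AUROC and FPR@95, assuming the convention that a higher score means more in-distribution:

```python
import numpy as np

def auroc_and_fpr95(id_scores, ood_scores):
    """AUROC and FPR@95%TPR, treating ID as the positive class.

    Hypothetical sketch: assumes higher score = more in-distribution.
    """
    id_scores = np.asarray(id_scores, dtype=float)
    ood_scores = np.asarray(ood_scores, dtype=float)

    # AUROC via the Mann-Whitney U statistic: the probability that a
    # random ID sample scores above a random OOD sample (ties count 0.5).
    diff = id_scores[:, None] - ood_scores[None, :]
    auroc = (diff > 0).mean() + 0.5 * (diff == 0).mean()

    # FPR@95: the false-positive rate at the threshold that still
    # accepts 95% of ID samples (the 5th percentile of ID scores).
    thresh = np.percentile(id_scores, 5)
    fpr95 = (ood_scores >= thresh).mean()
    return auroc, fpr95
```

With perfectly separated scores this returns an AUROC of 1.0 and an FPR@95 of 0.0; real detectors fall somewhere between, as in the results table below.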

Datasets Availability

ImageNet-1K is automatically downloaded from Hugging Face when you use the DataLoader.

The remaining datasets can be downloaded using datasets_download.py by running the following commands:

cd src/discopatch
python datasets_download.py [--imagenet]

Note: Use the --imagenet flag if you want to download ImageNet-C.
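The repository's script isn't reproduced here, but an optional flag like `--imagenet` is typically implemented as an argparse boolean switch; a minimal, hypothetical sketch:

```python
import argparse

# Hypothetical sketch of a --imagenet style flag; not the actual
# datasets_download.py from this repository.
parser = argparse.ArgumentParser(description="Download OOD benchmark datasets")
parser.add_argument(
    "--imagenet",
    action="store_true",  # defaults to False when the flag is omitted
    help="also download ImageNet-C (large, optional)",
)
args = parser.parse_args(["--imagenet"])  # simulate passing the flag
```

When the flag is omitted, `args.imagenet` is False and the ImageNet-C download is skipped.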

Model

Train and Evaluate Models

The commands required to train and evaluate each of the models are provided in the documentation section: DisCoPatch.md

Pre-trained Checkpoint

You can download the pre-trained DisCoPatch checkpoint using this link.

Results for DisCoPatch-64

| OOD Shift       | Dataset          | AUROC | FPR@95 |
|-----------------|------------------|-------|--------|
| Near-OOD        | SSB-hard         | 95.8% | 19.8%  |
| Near-OOD        | NINCO            | 94.3% | 39.0%  |
| Far-OOD         | iNaturalist      | 99.1% | 3.6%   |
| Far-OOD         | DTD              | 96.4% | 18.9%  |
| Far-OOD         | OpenImage-O      | 94.4% | 29.7%  |
| Covariate Shift | ImageNet-1K(-C)  | 97.2% | 10.6%  |

Experiment Tracking

The code examples are set up to use Weights & Biases to track your training runs. Refer to the full W&B documentation if needed, or follow these steps:

  1. Create an account in Weights & Biases

  2. If you have installed the requirements you can skip this step. If not, activate the conda environment and the virtualenv and run:

    pip install wandb
  3. Run the following command and insert your API key when prompted:

    wandb login

Repository Information

Dev

See the Developer guidelines for more information.

Contributing

Contributions of any kind are welcome. Please read CONTRIBUTING.md for details and the process for submitting pull requests to us.

License

This project is licensed under the terms of the MIT license. See LICENSE for more details.

References

All the repositories used to generate this code are mentioned in the corresponding files.

Citation

If you publish work that uses DisCoPatch, please cite DisCoPatch.

BibTeX information will be added later.
