- Use Conda to install the necessary dependencies in a new environment. From the root of the repository, run:

  ```shell
  conda env create -f environment.yml
  conda activate ppicker
  ```
- Install ProPicker itself:

  ```shell
  pip install .
  ```
To run ProPicker, you need the checkpoint of our pre-trained model, as well as the checkpoint of the TomoTwin model we used as the prompt encoder.
- You can download the ProPicker checkpoint here.
- You can download the TomoTwin checkpoint by running:

  ```shell
  bash download_tomotwin_ckpt.sh
  ```

After downloading, place the files in the `ProPicker` directory.
Set the following environment variables to point to the model files:

```shell
export PROPICKER_MODEL_FILE=/abs/path/to/propicker.ckpt
export TOMOTWIN_MODEL_FILE=/abs/path/to/tomotwin.pth
```
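As a quick sanity check before running anything, you can verify that both variables are set and point at existing files. This is a minimal sketch, not part of ProPicker itself; only the two environment variable names come from the setup above:

```python
import os


def check_model_files(env=os.environ):
    """Verify that the checkpoint environment variables are set and
    point at existing files. Returns a list of problems found (empty
    if everything looks fine)."""
    problems = []
    for var in ("PROPICKER_MODEL_FILE", "TOMOTWIN_MODEL_FILE"):
        path = env.get(var)
        if not path:
            problems.append(f"{var} is not set")
        elif not os.path.isfile(path):
            problems.append(f"{var} points to a missing file: {path}")
    return problems


if __name__ == "__main__":
    for problem in check_model_files():
        print("WARNING:", problem)
```

If the script prints nothing, both checkpoints are in place and you can proceed to the tutorials.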
There are two main workflows: prompt-based picking and fine-tuning. We provide tutorials for both.
Prompt-based picking can be done via both the Python package and the GUI/CLI. There are separate tutorials for both:
- Python Package Tutorial: See the `TUTORIAL1:empiar10988_prompt_based_picking.ipynb` notebook.
- GUI/CLI Tutorial: See our gui_tutorial.
Fine-tuning is currently only supported via the Python package and additional scripts. See the `TUTORIAL2:empiar10988_fine_tuning.ipynb` notebook, which continues from the first tutorial.
Training from scratch lives in `propicker/training_from_scratch/train.py`, with parameters in `propicker/training_from_scratch/train_cfg.py`.

To download the training data, you can use `datasets/download_train_data.sh`.

Note: The training data is large, so you might want to download it to a different location. To do this, modify `datasets/download_train_data.sh` and adjust the training data path in `propicker/paths.py` accordingly.
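An alternative to editing the path in `propicker/paths.py` is to keep the path the code expects and symlink it to the actual storage location. The sketch below is generic and not part of ProPicker; both directory arguments are placeholders you would substitute yourself:

```python
import os


def link_training_data(actual_dir, expected_dir):
    """Create `expected_dir` as a symbolic link to `actual_dir`, so code
    that reads the expected path transparently uses the real location.
    Both arguments are placeholder paths, not ProPicker defaults."""
    os.makedirs(actual_dir, exist_ok=True)
    parent = os.path.dirname(expected_dir)
    if parent:
        os.makedirs(parent, exist_ok=True)
    if not os.path.exists(expected_dir):
        os.symlink(actual_dir, expected_dir, target_is_directory=True)
    return os.path.realpath(expected_dir)
```

This keeps the repository configuration untouched, at the cost of requiring a filesystem that supports symbolic links.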
This repository contains code copied and modified from the following projects:
All derived code is explicitly marked as such.