Implementation of retrosynthesis prediction with LocalRetro, developed by Yousung Jung's group at KAIST (now at SNU).
Shuan Chen (shuan75@snu.ac.kr)
- Python (version >= 3.6)
- Numpy (version >= 1.16.4)
- PyTorch (version >= 1.0.0)
- RDKit (version >= 2019)
- DGL (version >= 0.5.2)
- DGLLife (version >= 0.2.6)
Create a virtual environment to run the LocalRetro code.
Install PyTorch with the CUDA version that matches your device.
cd LocalRetro
conda create -c conda-forge -n rdenv python=3.7 -y
conda activate rdenv
conda install pytorch cudatoolkit=10.2 -c pytorch -y
conda install -c conda-forge rdkit -y
pip install dgl
pip install dgllife
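If the installation succeeded, a quick sanity check like the one below should run without errors (a minimal sketch that only prints the installed versions; it assumes the packages above were installed into the active environment).

```python
# Minimal environment check for LocalRetro (sketch).
import numpy as np
import torch
import dgl
import dgllife
import rdkit

print('NumPy:', np.__version__)
print('PyTorch:', torch.__version__, '| CUDA available:', torch.cuda.is_available())
print('DGL:', dgl.__version__, '| DGL-LifeSci:', dgllife.__version__)
print('RDKit:', rdkit.__version__)
```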
The license has been updated to CC BY-NC-SA 4.0 (Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International). This means:
- Academic use: Free to use, share, and adapt with attribution
- Commercial use: Not permitted without prior written approval from the copyright holder
- Derivative works: Must be shared under the same license
We encourage academic use of this model, but it must not be used for any commercial purpose without our permission. For commercial licensing inquiries, please contact the developer.
To address the issue raised by the community (see also #15), the function get_atom_pair in model_utils.py has been updated.
We also changed the activation function from ReLU to GELU and recalculated the accuracy using both stereo-aware and stereo-unaware metrics, shown at the bottom of README.md (see #12).
For example, the following case (reaction #270 in the test set) is an ester hydrolysis reaction, which has nothing to do with the single bond highlighted in red, yet that bond is somehow changed in the ground truth. The prediction for this retrosynthesis is counted as wrong by the stereo-aware metric but correct by the stereo-unaware metric.
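To illustrate what a stereo-unaware comparison means here, the sketch below matches two reactant SMILES after stripping stereochemistry with RDKit; stereo_unaware_match is a hypothetical helper for illustration, not part of the LocalRetro code.

```python
from rdkit import Chem

def stereo_unaware_match(pred_smiles, true_smiles):
    """Compare two (possibly multi-component) SMILES after removing stereochemistry."""
    pred = Chem.MolFromSmiles(pred_smiles)
    true = Chem.MolFromSmiles(true_smiles)
    if pred is None or true is None:
        return False
    # isomericSmiles=False drops stereo marks, so only the molecular graphs are compared.
    return (Chem.MolToSmiles(pred, isomericSmiles=False)
            == Chem.MolToSmiles(true, isomericSmiles=False))
```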
We cleaned the code and simplified the templates, which yields 658 local reaction templates for the USPTO_50K dataset and 20,221 local reaction templates for the USPTO_MIT dataset. We therefore re-evaluated the top-k accuracy, and the results are updated at the bottom of README.md. Training takes around 100 minutes on an NVIDIA GeForce RTX 3090.
We are currently cleaning up the code; it will be uploaded again afterwards.
Shuan Chen and Yousung Jung. Deep Retrosynthetic Reaction Prediction using Local Reactivity and Global Attention, JACS Au 2021.
See the README in ./data to download the raw data files for training and testing the model.
A two-step data preprocessing is needed to train the LocalRetro model.
First, go to the preprocessing folder
cd preprocessing
and extract the reaction templates from the training data using the specified dataset name (default: USPTO_50K).
python Extract_from_train_data.py -d USPTO_50K
This will generate the following files:
(1) atom_templates.csv
(2) bond_templates.csv
(3) template_infos.csv
(4) template_rxnclass.csv (if train_class.csv exists in data folder)
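To get a quick feel for the extracted templates, you can load them with pandas, for example (a sketch; the output path is assumed to be data/USPTO_50K, so adjust it to wherever the extraction script writes its files).

```python
import pandas as pd

# Count the extracted local reaction templates (sketch; path is an assumption).
atom_templates = pd.read_csv('../data/USPTO_50K/atom_templates.csv')
bond_templates = pd.read_csv('../data/USPTO_50K/bond_templates.csv')
print(len(atom_templates), 'atom templates,', len(bond_templates), 'bond templates')
```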
By running
python Run_preprocessing.py -d USPTO_50K
you will get four preprocessed files:
(1) preprocessed_train.csv
(2) preprocessed_val.csv
(3) preprocessed_test.csv
(4) labeled_data.csv
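Similarly, you can check the size of each preprocessed split (again a sketch with an assumed path; point it at wherever Run_preprocessing.py saved the files).

```python
import pandas as pd

# Report the number of reactions in each preprocessed split (sketch; path is an assumption).
for split in ['train', 'val', 'test']:
    df = pd.read_csv(f'../data/USPTO_50K/preprocessed_{split}.csv')
    print(split, ':', len(df), 'reactions')
```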
Go to the scripts folder
cd ../scripts
and run the following to train the model on the specified dataset (default: USPTO_50K):
python Train.py -d USPTO_50K
The trained model will be saved at LocalRetro/models/LocalRetro_USPTO_50K.pth
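Once training finishes, the checkpoint can be reloaded with standard PyTorch calls; the sketch below only assumes the default save path shown above, and the exact checkpoint contents depend on how Train.py saves the model.

```python
import torch

# Reload the trained LocalRetro checkpoint (sketch; run from the scripts folder).
checkpoint = torch.load('../models/LocalRetro_USPTO_50K.pth', map_location='cpu')
# Depending on how Train.py saves, this may be a raw state_dict or a dict wrapping one.
print(type(checkpoint))
```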
To evaluate the trained model on the test set, simply run
python Test.py -d USPTO_50K
to get the raw prediction file saved at LocalRetro/outputs/raw_prediction/LocalRetro_USPTO_50K.txt
Finally, you can get the reactants for each prediction by decoding the raw prediction file:
python Decode_predictions.py -d USPTO_50K
The decoded reactants will be saved at
LocalRetro/outputs/decoded_prediction/LocalRetro_USPTO_50K.txt
and
LocalRetro/outputs/decoded_prediction_class/LocalRetro_USPTO_50K.txt
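If you want to score the decoded predictions yourself, a comparison along the following lines works; note that the line format assumed here (tab-separated ground-truth reactants followed by ranked predictions) is only a guess, so adapt the parsing to the actual output of Decode_predictions.py.

```python
from rdkit import Chem

def canonical(smiles):
    """Canonicalize a (possibly multi-component) SMILES; returns None if parsing fails."""
    mol = Chem.MolFromSmiles(smiles)
    return Chem.MolToSmiles(mol) if mol is not None else None

# Hypothetical top-1 accuracy over the decoded predictions.
# ASSUMPTION: each line is "<ground-truth reactants>\t<prediction 1>\t<prediction 2>...".
correct, total = 0, 0
with open('../outputs/decoded_prediction/LocalRetro_USPTO_50K.txt') as f:
    for line in f:
        fields = line.strip().split('\t')
        if len(fields) < 2:
            continue
        truth, top1 = fields[0], fields[1]
        total += 1
        if canonical(top1) == canonical(truth):
            correct += 1
print('Top-1 accuracy:', correct / max(total, 1))
```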
@article{chen2021deep,
title={Deep retrosynthetic reaction prediction using local reactivity and global attention},
author={Chen, Shuan and Jung, Yousung},
journal={JACS Au},
volume={1},
number={10},
pages={1612--1620},
year={2021},
publisher={ACS Publications}
}

This project is covered under the CC BY-NC-SA 4.0 (Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International) license. See the LICENSE file for details.
