- Run `train_and_test.ipynb` to train and test all models.
- To run from the command line instead:

```shell
jupyter nbconvert --to=script train_and_test.ipynb
ipython train_and_test.py
```
- seq2seq: Sequence to Sequence Learning with Neural Networks
- attention: Neural Machine Translation by Jointly Learning to Align and Translate
- effective approaches: Effective Approaches to Attention-based Neural Machine Translation
- coverage: Modeling Coverage for Neural Machine Translation
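The dot, general, and concat variants in the results table correspond to the attention scoring functions from Effective Approaches to Attention-based Neural Machine Translation. A minimal NumPy sketch of the three score functions (dimensions and randomly initialised weights here are illustrative, not taken from this repo's code):

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4                                # hidden size (illustrative)
h_t = rng.standard_normal(d)         # decoder hidden state at step t
H_s = rng.standard_normal((5, d))    # encoder hidden states (5 source positions)

# Randomly initialised parameters, for illustration only.
W = rng.standard_normal((d, d))       # "general" bilinear weight
Wc = rng.standard_normal((d, 2 * d))  # "concat" projection
v = rng.standard_normal(d)            # "concat" scoring vector

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# Scoring functions:
#   dot:     score = h_t . h_s
#   general: score = h_t . (W h_s)
#   concat:  score = v . tanh(Wc [h_t; h_s])
scores = {
    "dot": H_s @ h_t,
    "general": H_s @ W.T @ h_t,
    "concat": np.tanh(np.hstack([np.tile(h_t, (len(H_s), 1)), H_s]) @ Wc.T) @ v,
}
weights = {name: softmax(s) for name, s in scores.items()}  # alignment weights
contexts = {name: w @ H_s for name, w in weights.items()}   # context vectors c_t
```

Each variant produces a distribution over source positions, which weights the encoder states into a single context vector fed to the decoder.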
Click on a model name below to download the trained model.
| Model (Seq2Seq) | BLEU score |
|---|---|
| Linguistic coverage | 0.089 |
| General attention | 0.087 |
| Fertility coverage | 0.0829 |
| Vanilla | 0.082 |
| Concat attention | 0.0814 |
| Dot attention | 0.079 |
| MLP attention | 0.072 |
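The scores above are BLEU values; the repo does not state which implementation produced them, so actual numbers may differ depending on tokenization and smoothing. The metric itself can be sketched as follows (sentence-level, uniform n-gram weights, no smoothing):

```python
from collections import Counter
import math

def bleu(hypothesis, reference, max_n=4):
    """Sentence-level BLEU: geometric mean of clipped n-gram
    precisions (n = 1..max_n) times a brevity penalty."""
    precisions = []
    for n in range(1, max_n + 1):
        hyp_ngrams = Counter(tuple(hypothesis[i:i + n])
                             for i in range(len(hypothesis) - n + 1))
        ref_ngrams = Counter(tuple(reference[i:i + n])
                             for i in range(len(reference) - n + 1))
        # Clipped counts: an n-gram only matches up to its reference count.
        overlap = sum((hyp_ngrams & ref_ngrams).values())
        precisions.append(overlap / max(sum(hyp_ngrams.values()), 1))
    if min(precisions) == 0:
        return 0.0  # without smoothing, any zero precision zeroes the score
    log_avg = sum(math.log(p) for p in precisions) / max_n
    bp = min(1.0, math.exp(1 - len(reference) / len(hypothesis)))
    return bp * math.exp(log_avg)

hyp = "the cat sat on the mat".split()
ref = "the cat sat on a mat".split()
score = bleu(hyp, ref)
```

Corpus-level BLEU (as reported in the table) aggregates n-gram counts over all test sentences before taking the geometric mean, rather than averaging per-sentence scores.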