From a3932fdc813ad167f2d53d78ad65a510678b3167 Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?Kaan=20Enes=20=C3=87ift=C3=A7i?=
Date: Sun, 3 Jul 2022 09:16:16 +0300
Subject: [PATCH] Update README.md

Dead link at line 568 replaced with working link

---
 README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/README.md b/README.md
index 343153026..0aa0c63d7 100644
--- a/README.md
+++ b/README.md
@@ -565,7 +565,7 @@ target hidden state at the top layer of a vanilla seq2seq model. The function
 
 Various implementations of attention mechanisms can be found in
-[attention_wrapper.py](https://github.com/tensorflow/tensorflow/blob/master/tensorflow/contrib/seq2seq/python/ops/attention_wrapper.py).
+[attention_wrapper.py](https://github.com/tensorflow/addons/blob/master/tensorflow_addons/seq2seq/attention_wrapper.py).
 
 ***What matters in the attention mechanism?***
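
For readers following the updated link, a minimal sketch of how the attention mechanisms in the relocated `attention_wrapper.py` are used through the `tensorflow_addons` package. The shapes, unit sizes, and variable names below are illustrative assumptions, not taken from the patch or the nmt README.

```python
import tensorflow as tf
import tensorflow_addons as tfa

# Illustrative sizes (assumptions, not from the patch).
batch_size, max_time, num_units = 4, 7, 128

# Encoder outputs act as the attention "memory".
encoder_outputs = tf.random.normal([batch_size, max_time, num_units])

# Luong (multiplicative) attention over the encoder outputs.
attention_mechanism = tfa.seq2seq.LuongAttention(
    units=num_units, memory=encoder_outputs)

# Wrap a decoder cell so that each decoding step attends
# over the memory before producing its output.
decoder_cell = tfa.seq2seq.AttentionWrapper(
    tf.keras.layers.LSTMCell(num_units),
    attention_mechanism,
    attention_layer_size=num_units)

# Run a single decoder step from the zero state with a
# random input embedding, just to show the call pattern.
state = decoder_cell.get_initial_state(
    batch_size=batch_size, dtype=tf.float32)
inputs = tf.random.normal([batch_size, num_units])
output, state = decoder_cell(inputs, state)
print(output.shape)  # (4, 128)
```

`tfa.seq2seq.BahdanauAttention` is a drop-in alternative here; swapping it for `LuongAttention` changes the scoring function (additive vs. multiplicative) without touching the rest of the wiring.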