Commit ab4264c

comet links fix
1 parent 2891504 commit ab4264c

File tree

7 files changed: +7 -7 lines changed

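The fix prepends an explicit `https://` scheme to the Comet badge links. Without a scheme, an href such as `comet.ml/labml/transformer/...` is parsed as a relative path and resolved against the current page's URL, so the link never leaves the docs site. A minimal sketch of that resolution with Python's `urllib.parse.urljoin` (the page URL below is a hypothetical example, not taken from this commit):

```python
from urllib.parse import urljoin

# Hypothetical URL of a docs page carrying the badge (for illustration only).
page = "https://nn.labml.ai/transformers/mha.html"

# Scheme-less href: treated as a relative path and resolved against the page,
# so the resulting link stays on the docs host instead of reaching comet.ml.
broken = urljoin(page, "comet.ml/labml/transformer/ea8c108c2d94434ca3c2bc2b21015082")

# With an explicit scheme the href is absolute and resolves as intended.
fixed = urljoin(page, "https://comet.ml/labml/transformer/ea8c108c2d94434ca3c2bc2b21015082")

print(broken)  # → https://nn.labml.ai/transformers/comet.ml/labml/transformer/ea8c108c2d94434ca3c2bc2b21015082
print(fixed)   # → https://comet.ml/labml/transformer/ea8c108c2d94434ca3c2bc2b21015082
```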

docs/transformers/basic/autoregressive_experiment.html

Lines changed: 1 addition & 1 deletion

@@ -70,7 +70,7 @@
 <a href='#section-0'>#</a>
 </div>
 <h1>Transformer Auto-Regression Experiment</h1>
-<p><a href="https://colab.research.google.com/github/labmlai/annotated_deep_learning_paper_implementations/blob/master/labml_nn/transformers/basic/autoregressive_experiment.ipynb"><img alt="Open In Colab" src="https://colab.research.google.com/assets/colab-badge.svg"></a> <a href="comet.ml/labml/transformer/ea8c108c2d94434ca3c2bc2b21015082"><img alt="Open In Comet" src="https://images.labml.ai/images/comet.svg?experiment=capsule_networks&file=model"></a></p>
+<p><a href="https://colab.research.google.com/github/labmlai/annotated_deep_learning_paper_implementations/blob/master/labml_nn/transformers/basic/autoregressive_experiment.ipynb"><img alt="Open In Colab" src="https://colab.research.google.com/assets/colab-badge.svg"></a> <a href="https://comet.ml/labml/transformer/ea8c108c2d94434ca3c2bc2b21015082"><img alt="Open In Comet" src="https://images.labml.ai/images/comet.svg?experiment=capsule_networks&file=model"></a></p>
 <p>This trains a simple transformer introduced in <a href="https://papers.labml.ai/paper/1706.03762">Attention Is All You Need</a> on an NLP auto-regression task (with Tiny Shakespeare dataset).</p>
 
 </div>

docs/transformers/mha.html

Lines changed: 1 addition & 1 deletion

@@ -69,7 +69,7 @@
 <a href='#section-0'>#</a>
 </div>
 <h1>Multi-Headed Attention (MHA)</h1>
-<p><a href="https://colab.research.google.com/github/labmlai/annotated_deep_learning_paper_implementations/blob/master/labml_nn/transformers/basic/autoregressive_experiment.ipynb"><img alt="Open In Colab" src="https://colab.research.google.com/assets/colab-badge.svg"></a> <a href="comet.ml/labml/transformer/ea8c108c2d94434ca3c2bc2b21015082"><img alt="Open In Comet" src="https://images.labml.ai/images/comet.svg?experiment=capsule_networks&file=model"></a></p>
+<p><a href="https://colab.research.google.com/github/labmlai/annotated_deep_learning_paper_implementations/blob/master/labml_nn/transformers/basic/autoregressive_experiment.ipynb"><img alt="Open In Colab" src="https://colab.research.google.com/assets/colab-badge.svg"></a> <a href="https://comet.ml/labml/transformer/ea8c108c2d94434ca3c2bc2b21015082"><img alt="Open In Comet" src="https://images.labml.ai/images/comet.svg?experiment=capsule_networks&file=model"></a></p>
 <p>This is a tutorial/implementation of multi-headed attention from paper <a href="https://papers.labml.ai/paper/1706.03762">Attention Is All You Need</a> in <a href="https://pytorch.org/">PyTorch</a>. The implementation is inspired from <a href="https://nlp.seas.harvard.edu/2018/04/03/attention.html">Annotated Transformer</a>.</p>
 <p>Here is the <a href="basic/autoregressive_experiment.html">training code</a> that uses a basic transformer with MHA for NLP auto-regression.</p>
 <p><a href="basic/autoregressive_experiment.html">Here is an experiment implementation</a> that trains a simple transformer.</p>

docs/transformers/models.html

Lines changed: 1 addition & 1 deletion

@@ -69,7 +69,7 @@
 <a href='#section-0'>#</a>
 </div>
 <h1>Transformer Encoder and Decoder Models</h1>
-<p><a href="https://colab.research.google.com/github/labmlai/annotated_deep_learning_paper_implementations/blob/master/labml_nn/transformers/basic/autoregressive_experiment.ipynb"><img alt="Open In Colab" src="https://colab.research.google.com/assets/colab-badge.svg"></a> <a href="comet.ml/labml/transformer/ea8c108c2d94434ca3c2bc2b21015082"><img alt="Open In Comet" src="https://images.labml.ai/images/comet.svg?experiment=capsule_networks&file=model"></a></p>
+<p><a href="https://colab.research.google.com/github/labmlai/annotated_deep_learning_paper_implementations/blob/master/labml_nn/transformers/basic/autoregressive_experiment.ipynb"><img alt="Open In Colab" src="https://colab.research.google.com/assets/colab-badge.svg"></a> <a href="https://comet.ml/labml/transformer/ea8c108c2d94434ca3c2bc2b21015082"><img alt="Open In Comet" src="https://images.labml.ai/images/comet.svg?experiment=capsule_networks&file=model"></a></p>
 
 </div>
 <div class='code'>

labml_nn/transformers/basic/autoregressive_experiment.ipynb

Lines changed: 1 addition & 1 deletion

@@ -11,7 +11,7 @@
 "source": [
 "[![Github](https://img.shields.io/github/stars/labmlai/annotated_deep_learning_paper_implementations?style=social)](https://github.com/labmlai/annotated_deep_learning_paper_implementations)\n",
 "[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/labmlai/annotated_deep_learning_paper_implementations/blob/master/labml_nn/transformers/basic/autoregressive_experiment.ipynb)\n",
-"[![Open In Comet](https://images.labml.ai/images/comet.svg?experiment=capsule_networks&file=model)](comet.ml/labml/transformer/ea8c108c2d94434ca3c2bc2b21015082)\n",
+"[![Open In Comet](https://images.labml.ai/images/comet.svg?experiment=capsule_networks&file=model)](https://comet.ml/labml/transformer/ea8c108c2d94434ca3c2bc2b21015082)\n",
 "\n",
 "## Transformer Experiment\n",
 "\n",

labml_nn/transformers/basic/autoregressive_experiment.py

Lines changed: 1 addition & 1 deletion

@@ -8,7 +8,7 @@
 # Transformer Auto-Regression Experiment
 
 [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/labmlai/annotated_deep_learning_paper_implementations/blob/master/labml_nn/transformers/basic/autoregressive_experiment.ipynb)
-[![Open In Comet](https://images.labml.ai/images/comet.svg?experiment=capsule_networks&file=model)](comet.ml/labml/transformer/ea8c108c2d94434ca3c2bc2b21015082)
+[![Open In Comet](https://images.labml.ai/images/comet.svg?experiment=capsule_networks&file=model)](https://comet.ml/labml/transformer/ea8c108c2d94434ca3c2bc2b21015082)
 
 This trains a simple transformer introduced in [Attention Is All You Need](https://papers.labml.ai/paper/1706.03762)
 on an NLP auto-regression task (with Tiny Shakespeare dataset).

labml_nn/transformers/mha.py

Lines changed: 1 addition & 1 deletion

@@ -9,7 +9,7 @@
 # Multi-Headed Attention (MHA)
 
 [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/labmlai/annotated_deep_learning_paper_implementations/blob/master/labml_nn/transformers/basic/autoregressive_experiment.ipynb)
-[![Open In Comet](https://images.labml.ai/images/comet.svg?experiment=capsule_networks&file=model)](comet.ml/labml/transformer/ea8c108c2d94434ca3c2bc2b21015082)
+[![Open In Comet](https://images.labml.ai/images/comet.svg?experiment=capsule_networks&file=model)](https://comet.ml/labml/transformer/ea8c108c2d94434ca3c2bc2b21015082)
 
 This is a tutorial/implementation of multi-headed attention
 from paper [Attention Is All You Need](https://papers.labml.ai/paper/1706.03762)

labml_nn/transformers/models.py

Lines changed: 1 addition & 1 deletion

@@ -9,7 +9,7 @@
 # Transformer Encoder and Decoder Models
 
 [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/labmlai/annotated_deep_learning_paper_implementations/blob/master/labml_nn/transformers/basic/autoregressive_experiment.ipynb)
-[![Open In Comet](https://images.labml.ai/images/comet.svg?experiment=capsule_networks&file=model)](comet.ml/labml/transformer/ea8c108c2d94434ca3c2bc2b21015082)
+[![Open In Comet](https://images.labml.ai/images/comet.svg?experiment=capsule_networks&file=model)](https://comet.ml/labml/transformer/ea8c108c2d94434ca3c2bc2b21015082)
 """
 import math
 
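All seven changes follow the same pattern: a `comet.ml/...` link target gains an `https://` prefix. A batch rewrite could apply the whole commit in one pass; the sketch below is illustrative, not the tooling actually used here, and the function name and regex are assumptions:

```python
import re

def fix_comet_links(text: str) -> str:
    """Prefix scheme-less comet.ml link targets with https://.

    The negative lookbehind skips occurrences already preceded by a
    path separator, a subdomain dot, or a word character, so absolute
    URLs like https://comet.ml/... are left untouched.
    """
    return re.sub(r'(?<![/.\w])comet\.ml/', 'https://comet.ml/', text)

# A markdown badge target gains a scheme; an absolute URL is unchanged.
print(fix_comet_links('](comet.ml/labml/transformer/ea8c108c2d94434ca3c2bc2b21015082)'))
print(fix_comet_links('https://comet.ml/labml'))  # → https://comet.ml/labml
```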
