Commit ee5a34a

experiment links transformer
1 parent e09ee89 commit ee5a34a

9 files changed: +247 -240 lines changed


docs/sitemap.xml

Lines changed: 1 addition & 1 deletion
@@ -239,7 +239,7 @@
 
 <url>
 <loc>https://nn.labml.ai/experiments/nlp_autoregression.html</loc>
-<lastmod>2022-06-25T16:30:00+00:00</lastmod>
+<lastmod>2022-06-27T16:30:00+00:00</lastmod>
 <priority>1.00</priority>
 </url>

docs/transformers/basic/autoregressive_experiment.html

Lines changed: 63 additions & 63 deletions
Large diffs are not rendered by default.

docs/transformers/mha.html

Lines changed: 64 additions & 64 deletions
Large diffs are not rendered by default.

docs/transformers/models.html

Lines changed: 108 additions & 107 deletions
Large diffs are not rendered by default.

labml_nn/transformers/basic/autoregressive_experiment.ipynb

Lines changed: 1 addition & 0 deletions
@@ -11,6 +11,7 @@
 "source": [
 "[![Github](https://img.shields.io/github/stars/labmlai/annotated_deep_learning_paper_implementations?style=social)](https://github.com/labmlai/annotated_deep_learning_paper_implementations)\n",
 "[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/labmlai/annotated_deep_learning_paper_implementations/blob/master/labml_nn/transformers/basic/autoregressive_experiment.ipynb)\n",
+"[![Open In Comet](https://images.labml.ai/images/comet.svg?experiment=capsule_networks&file=model)](comet.ml/labml/transformer/ea8c108c2d94434ca3c2bc2b21015082)\n",
 "\n",
 "## Transformer Experiment\n",
 "\n",

labml_nn/transformers/basic/autoregressive_experiment.py

Lines changed: 3 additions & 2 deletions
@@ -7,10 +7,11 @@
 
 # Transformer Auto-Regression Experiment
 
+[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/labmlai/annotated_deep_learning_paper_implementations/blob/master/labml_nn/transformers/basic/autoregressive_experiment.ipynb)
+[![Open In Comet](https://images.labml.ai/images/comet.svg?experiment=capsule_networks&file=model)](comet.ml/labml/transformer/ea8c108c2d94434ca3c2bc2b21015082)
+
 This trains a simple transformer introduced in [Attention Is All You Need](https://papers.labml.ai/paper/1706.03762)
 on an NLP auto-regression task (with Tiny Shakespeare dataset).
-
-[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/labmlai/annotated_deep_learning_paper_implementations/blob/master/labml_nn/transformers/basic/autoregressive_experiment.ipynb)
 """
 
 import torch
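
For readers arriving from the badges, a minimal sketch of the auto-regression objective this experiment optimizes: predict each next character from the ones before it. The toy model and names below are hypothetical stand-ins for the transformer configured in this file; only the shifted-target loss is the point.

import torch
import torch.nn as nn

# Hypothetical stand-in for the experiment's transformer.
vocab_size, d_model, seq_len = 65, 128, 32  # Tiny Shakespeare has ~65 distinct characters
model = nn.Sequential(nn.Embedding(vocab_size, d_model),
                      nn.Linear(d_model, vocab_size))
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

tokens = torch.randint(0, vocab_size, (seq_len + 1,))  # a chunk of encoded text
x, y = tokens[:-1], tokens[1:]  # targets are the inputs shifted by one position
logits = model(x)               # [seq_len, vocab_size]
opt.zero_grad()
loss_fn(logits, y).backward()   # cross-entropy on next-character prediction
opt.step()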

labml_nn/transformers/mha.py

Lines changed: 3 additions & 2 deletions
@@ -8,6 +8,9 @@
 
 # Multi-Headed Attention (MHA)
 
+[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/labmlai/annotated_deep_learning_paper_implementations/blob/master/labml_nn/transformers/basic/autoregressive_experiment.ipynb)
+[![Open In Comet](https://images.labml.ai/images/comet.svg?experiment=capsule_networks&file=model)](comet.ml/labml/transformer/ea8c108c2d94434ca3c2bc2b21015082)
+
 This is a tutorial/implementation of multi-headed attention
 from paper [Attention Is All You Need](https://papers.labml.ai/paper/1706.03762)
 in [PyTorch](https://pytorch.org/).
@@ -17,8 +20,6 @@
 with MHA for NLP auto-regression.
 
 [Here is an experiment implementation](basic/autoregressive_experiment.html) that trains a simple transformer.
-
-[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/labmlai/annotated_deep_learning_paper_implementations/blob/master/labml_nn/transformers/basic/autoregressive_experiment.ipynb)
 """
 
 import math
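
Since this commit only moves the badges, a quick usage sketch of the module documented here may help; the shapes and keyword-only forward arguments below are my reading of mha.py's MultiHeadAttention, so treat them as assumptions rather than a guaranteed API.

import torch
from labml_nn.transformers.mha import MultiHeadAttention

# labml_nn modules use the [seq_len, batch_size, d_model] shape convention.
mha = MultiHeadAttention(heads=8, d_model=512)

x = torch.randn(10, 2, 512)         # 10 tokens, batch of 2
out = mha(query=x, key=x, value=x)  # q, k and v all from x, i.e. self-attention
print(out.shape)                    # torch.Size([10, 2, 512])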

labml_nn/transformers/models.py

Lines changed: 3 additions & 0 deletions
@@ -7,6 +7,9 @@
 ---
 
 # Transformer Encoder and Decoder Models
+
+[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/labmlai/annotated_deep_learning_paper_implementations/blob/master/labml_nn/transformers/basic/autoregressive_experiment.ipynb)
+[![Open In Comet](https://images.labml.ai/images/comet.svg?experiment=capsule_networks&file=model)](comet.ml/labml/transformer/ea8c108c2d94434ca3c2bc2b21015082)
 """
 import math
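
A sketch of wiring these classes into an encoder stack, assuming the constructor signatures in models.py (TransformerLayer composing self-attention and a feed-forward network, Encoder cloning the layer n times); the argument names are my reading of the code, not something this commit confirms.

import torch
from labml_nn.transformers.feed_forward import FeedForward
from labml_nn.transformers.mha import MultiHeadAttention
from labml_nn.transformers.models import Encoder, TransformerLayer

d_model = 512
# src_attn=None makes this an encoder layer: self-attention + FFN only.
layer = TransformerLayer(d_model=d_model,
                         self_attn=MultiHeadAttention(8, d_model),
                         src_attn=None,
                         feed_forward=FeedForward(d_model, 2048),
                         dropout_prob=0.1)
encoder = Encoder(layer, n_layers=6)  # six deep-copied layers plus a final norm

x = torch.randn(10, 2, d_model)       # [seq_len, batch_size, d_model]
out = encoder(x, mask=None)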

setup.py

Lines changed: 1 addition & 1 deletion
@@ -5,7 +5,7 @@
 
 setuptools.setup(
     name='labml-nn',
-    version='0.4.125',
+    version='0.4.126',
     author="Varuna Jayasiri, Nipun Wijerathne",
     author_email="vpjayasiri@gmail.com, hnipun@gmail.com",
     description="🧑‍🏫 Implementations/tutorials of deep learning papers with side-by-side notes 📝; including transformers (original, xl, switch, feedback, vit), optimizers (adam, radam, adabelief), gans(dcgan, cyclegan, stylegan2), 🎮 reinforcement learning (ppo, dqn), capsnet, distillation, etc. 🧠",

0 commit comments
