Commit 58b24f6

Diffusion Notebook (labmlai#127)
1 parent 0ce65ad commit 58b24f6

File tree

8 files changed: +496, -168 lines


docs/diffusion/ddpm/experiment.html

Lines changed: 112 additions & 108 deletions
Large diffs are not rendered by default.

docs/diffusion/ddpm/index.html

Lines changed: 44 additions & 44 deletions
Large diffs are not rendered by default.

docs/optimizers/index.html

Lines changed: 2 additions & 2 deletions
@@ -82,12 +82,12 @@ <h2>Generic Adaptive Optimizer Base class and Weight Decay</h2>
 <p>We also define a special class for L2 weight decay, so that we don&#x27;t have to implement it inside each of the optimizers, and can easily extend to other weight decays like L1 without changing the optimizers.</p>
 <p>Here are some concepts on PyTorch optimizers:</p>
 <h3>Parameter groups</h3>
-<p>PyTorch optimizers group parameters into sets called groups. Each group can have it&#x27;s own hyper-parameters like learning rates.</p>
+<p>PyTorch optimizers group parameters into sets called groups. Each group can have its own hyper-parameters like learning rates.</p>
 <p>In most common cases there will be only one group. This is when you initialize your optimizer with,</p>
 <pre class="highlight lang-python"><code><span></span><span class="n">Optimizer</span><span class="p">(</span><span class="n">model</span><span class="o">.</span><span class="n">parameters</span><span class="p">())</span></code></pre>
 <p>You can define multiple parameter groups when initializing the optimizer:</p>
 <pre class="highlight lang-python"><code><span></span><span class="n">Optimizer</span><span class="p">([{</span><span class="s1">&#39;params&#39;</span><span class="p">:</span> <span class="n">model1</span><span class="o">.</span><span class="n">parameters</span><span class="p">()},</span> <span class="p">{</span><span class="s1">&#39;params&#39;</span><span class="p">:</span> <span class="n">model2</span><span class="o">.</span><span class="n">parameters</span><span class="p">(),</span> <span class="s1">&#39;lr&#39;</span><span class="p">:</span> <span class="mi">2</span><span class="p">}])</span></code></pre>
-<p>Here we pass a list of groups. Each group is a dictionary with it&#x27;s parameters under the key &#x27;params&#x27;. You specify any hyper-parameters as well. If the hyper parameters are not defined they will default to the optimizer level defaults.</p>
+<p>Here we pass a list of groups. Each group is a dictionary with its parameters under the key &#x27;params&#x27;. You specify any hyper-parameters as well. If the hyper parameters are not defined they will default to the optimizer level defaults.</p>
 <p>You can access (and even change) these groups, and their hyper-parameters with <code class="highlight"><span></span><span class="n">optimizer</span><span class="o">.</span><span class="n">param_groups</span></code>
 . Most learning rate schedule implementations I&#x27;ve come across do access this and change &#x27;lr&#x27;.</p>
 <h3>States</h3>
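
The paragraph updated in the optimizer docs above describes creating parameter groups and then reading or changing them through optimizer.param_groups. A minimal sketch of what that looks like in practice (not part of this commit; it assumes a stock torch.optim.SGD optimizer and two hypothetical nn.Linear models):

    import torch
    import torch.nn as nn

    # Two throwaway models, each placed in its own parameter group (hypothetical example).
    model1 = nn.Linear(4, 4)
    model2 = nn.Linear(4, 4)

    # The second group overrides the optimizer-level default learning rate (lr=0.1).
    optimizer = torch.optim.SGD(
        [{'params': model1.parameters()},
         {'params': model2.parameters(), 'lr': 2.0}],
        lr=0.1)

    # Each group is a dict; hyper-parameters not given per group fall back to the defaults.
    for i, group in enumerate(optimizer.param_groups):
        print(i, group['lr'])  # prints: 0 0.1, then 1 2.0

    # A learning rate schedule can simply overwrite 'lr' in each group,
    # e.g. halving every group's learning rate on each call.
    def halve_lr(opt):
        for group in opt.param_groups:
            group['lr'] *= 0.5

    halve_lr(optimizer)
    print([g['lr'] for g in optimizer.param_groups])  # [0.05, 1.0]

Built-in schedulers such as torch.optim.lr_scheduler.StepLR update 'lr' through param_groups in essentially this way.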

docs/sitemap.xml

Lines changed: 9 additions & 9 deletions
@@ -246,7 +246,7 @@
 
 <url>
 <loc>https://nn.labml.ai/experiments/arithmetic_dataset.html</loc>
-<lastmod>2022-06-02T16:30:00+00:00</lastmod>
+<lastmod>2022-06-03T16:30:00+00:00</lastmod>
 <priority>1.00</priority>
 </url>
 
@@ -379,14 +379,14 @@
 
 <url>
 <loc>https://nn.labml.ai/diffusion/ddpm/index.html</loc>
-<lastmod>2022-03-21T16:30:00+00:00</lastmod>
+<lastmod>2022-06-09T16:30:00+00:00</lastmod>
 <priority>1.00</priority>
 </url>
 
 
 <url>
 <loc>https://nn.labml.ai/diffusion/ddpm/experiment.html</loc>
-<lastmod>2021-10-21T16:30:00+00:00</lastmod>
+<lastmod>2022-06-09T16:30:00+00:00</lastmod>
 <priority>1.00</priority>
 </url>
 
@@ -435,7 +435,7 @@
 
 <url>
 <loc>https://nn.labml.ai/optimizers/index.html</loc>
-<lastmod>2021-10-19T16:30:00+00:00</lastmod>
+<lastmod>2022-06-03T16:30:00+00:00</lastmod>
 <priority>1.00</priority>
 </url>
 
@@ -610,35 +610,35 @@
 
 <url>
 <loc>https://nn.labml.ai/transformers/rope/index.html</loc>
-<lastmod>2022-05-31T16:30:00+00:00</lastmod>
+<lastmod>2022-06-03T16:30:00+00:00</lastmod>
 <priority>1.00</priority>
 </url>
 
 
 <url>
 <loc>https://nn.labml.ai/transformers/rope/value_pe/arithmetic_experiment.html</loc>
-<lastmod>2022-06-02T16:30:00+00:00</lastmod>
+<lastmod>2022-06-03T16:30:00+00:00</lastmod>
 <priority>1.00</priority>
 </url>
 
 
 <url>
 <loc>https://nn.labml.ai/transformers/rope/value_pe/index.html</loc>
-<lastmod>2022-06-02T16:30:00+00:00</lastmod>
+<lastmod>2022-06-03T16:30:00+00:00</lastmod>
 <priority>1.00</priority>
 </url>
 
 
 <url>
 <loc>https://nn.labml.ai/transformers/rope/value_pe/experiment.html</loc>
-<lastmod>2022-05-31T16:30:00+00:00</lastmod>
+<lastmod>2022-06-03T16:30:00+00:00</lastmod>
 <priority>1.00</priority>
 </url>
 
 
 <url>
 <loc>https://nn.labml.ai/transformers/rope/experiment.html</loc>
-<lastmod>2022-05-31T16:30:00+00:00</lastmod>
+<lastmod>2022-06-03T16:30:00+00:00</lastmod>
 <priority>1.00</priority>
 </url>
 

labml_nn/diffusion/ddpm/__init__.py

Lines changed: 2 additions & 1 deletion
@@ -157,7 +157,8 @@
 [training code](experiment.html).
 [This file](evaluate.html) can generate samples and interpolations from a trained model.
 
-[![View Run](https://img.shields.io/badge/labml-experiment-brightgreen)](https://app.labml.ai/run/a44333ea251411ec8007d1a1762ed686)
+[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/labmlai/annotated_deep_learning_paper_implementations/blob/master/labml_nn/diffusion/ddpm/experiment.ipynb)
+[![Open In Comet](https://images.labml.ai/images/comet.svg?experiment=capsule_networks&file=model)](https://www.comet.ml/labml/diffuse/1260757bcd6148e084ad3a46c38ac5c4?experiment-tab=chart&showOutliers=true&smoothing=0&transformY=smoothing&xAxis=step)
 """
 from typing import Tuple, Optional
 