Diffusion Notebook #127

Merged
merged 2 commits into from Jun 9, 2022
220 changes: 112 additions & 108 deletions docs/diffusion/ddpm/experiment.html

Large diffs are not rendered by default.

88 changes: 44 additions & 44 deletions docs/diffusion/ddpm/index.html

Large diffs are not rendered by default.

4 changes: 2 additions & 2 deletions docs/optimizers/index.html
@@ -82,12 +82,12 @@ <h2>Generic Adaptive Optimizer Base class and Weight Decay</h2>
<p>We also define a special class for L2 weight decay, so that we don&#x27;t have to implement it inside each of the optimizers, and can easily extend to other weight decays like L1 without changing the optimizers.</p>
<p>Here are some concepts on PyTorch optimizers:</p>
<h3>Parameter groups</h3>
<p>PyTorch optimizers group parameters into sets called groups. Each group can have it&#x27;s own hyper-parameters like learning rates.</p>
<p>PyTorch optimizers group parameters into sets called groups. Each group can have its own hyper-parameters like learning rates.</p>
<p>In most common cases there will be only one group. This is when you initialize your optimizer with,</p>
<pre class="highlight lang-python"><code><span></span><span class="n">Optimizer</span><span class="p">(</span><span class="n">model</span><span class="o">.</span><span class="n">parameters</span><span class="p">())</span></code></pre>
<p>You can define multiple parameter groups when initializing the optimizer:</p>
<pre class="highlight lang-python"><code><span></span><span class="n">Optimizer</span><span class="p">([{</span><span class="s1">&#39;params&#39;</span><span class="p">:</span> <span class="n">model1</span><span class="o">.</span><span class="n">parameters</span><span class="p">()},</span> <span class="p">{</span><span class="s1">&#39;params&#39;</span><span class="p">:</span> <span class="n">model2</span><span class="o">.</span><span class="n">parameters</span><span class="p">(),</span> <span class="s1">&#39;lr&#39;</span><span class="p">:</span> <span class="mi">2</span><span class="p">}])</span></code></pre>
<p>Here we pass a list of groups. Each group is a dictionary with it&#x27;s parameters under the key &#x27;params&#x27;. You specify any hyper-parameters as well. If the hyper parameters are not defined they will default to the optimizer level defaults.</p>
<p>Here we pass a list of groups. Each group is a dictionary with its parameters under the key &#x27;params&#x27;. You specify any hyper-parameters as well. If the hyper parameters are not defined they will default to the optimizer level defaults.</p>
<p>You can access (and even change) these groups, and their hyper-parameters with <code class="highlight"><span></span><span class="n">optimizer</span><span class="o">.</span><span class="n">param_groups</span></code>
. Most learning rate schedule implementations I&#x27;ve come across do access this and change &#x27;lr&#x27;.</p>
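<p>As an illustration (this snippet is a sketch added here, not part of the original documentation; the 0.9 decay factor is an arbitrary example), a scheduler-style update of the learning rate via <code>optimizer.param_groups</code> might look like this:</p>
<pre class="highlight lang-python"><code>import torch

# A toy model and optimizer, just to have some parameter groups to work with
model = torch.nn.Linear(4, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# Read and mutate the hyper-parameters of every group;
# most learning rate schedulers do exactly this
for group in optimizer.param_groups:
    group['lr'] *= 0.9  # decay the learning rate by 10%
</code></pre>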
<h3>States</h3>
18 changes: 9 additions & 9 deletions docs/sitemap.xml
@@ -246,7 +246,7 @@

<url>
<loc>https://nn.labml.ai/experiments/arithmetic_dataset.html</loc>
<lastmod>2022-06-02T16:30:00+00:00</lastmod>
<lastmod>2022-06-03T16:30:00+00:00</lastmod>
<priority>1.00</priority>
</url>

@@ -379,14 +379,14 @@

<url>
<loc>https://nn.labml.ai/diffusion/ddpm/index.html</loc>
<lastmod>2022-03-21T16:30:00+00:00</lastmod>
<lastmod>2022-06-09T16:30:00+00:00</lastmod>
<priority>1.00</priority>
</url>


<url>
<loc>https://nn.labml.ai/diffusion/ddpm/experiment.html</loc>
<lastmod>2021-10-21T16:30:00+00:00</lastmod>
<lastmod>2022-06-09T16:30:00+00:00</lastmod>
<priority>1.00</priority>
</url>

@@ -435,7 +435,7 @@

<url>
<loc>https://nn.labml.ai/optimizers/index.html</loc>
<lastmod>2021-10-19T16:30:00+00:00</lastmod>
<lastmod>2022-06-03T16:30:00+00:00</lastmod>
<priority>1.00</priority>
</url>

@@ -610,35 +610,35 @@

<url>
<loc>https://nn.labml.ai/transformers/rope/index.html</loc>
<lastmod>2022-05-31T16:30:00+00:00</lastmod>
<lastmod>2022-06-03T16:30:00+00:00</lastmod>
<priority>1.00</priority>
</url>


<url>
<loc>https://nn.labml.ai/transformers/rope/value_pe/arithmetic_experiment.html</loc>
<lastmod>2022-06-02T16:30:00+00:00</lastmod>
<lastmod>2022-06-03T16:30:00+00:00</lastmod>
<priority>1.00</priority>
</url>


<url>
<loc>https://nn.labml.ai/transformers/rope/value_pe/index.html</loc>
<lastmod>2022-06-02T16:30:00+00:00</lastmod>
<lastmod>2022-06-03T16:30:00+00:00</lastmod>
<priority>1.00</priority>
</url>


<url>
<loc>https://nn.labml.ai/transformers/rope/value_pe/experiment.html</loc>
<lastmod>2022-05-31T16:30:00+00:00</lastmod>
<lastmod>2022-06-03T16:30:00+00:00</lastmod>
<priority>1.00</priority>
</url>


<url>
<loc>https://nn.labml.ai/transformers/rope/experiment.html</loc>
<lastmod>2022-05-31T16:30:00+00:00</lastmod>
<lastmod>2022-06-03T16:30:00+00:00</lastmod>
<priority>1.00</priority>
</url>

3 changes: 2 additions & 1 deletion labml_nn/diffusion/ddpm/__init__.py
@@ -157,7 +157,8 @@
[training code](experiment.html).
[This file](evaluate.html) can generate samples and interpolations from a trained model.

[![View Run](https://img.shields.io/badge/labml-experiment-brightgreen)](https://app.labml.ai/run/a44333ea251411ec8007d1a1762ed686)
[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/labmlai/annotated_deep_learning_paper_implementations/blob/master/labml_nn/diffusion/ddpm/experiment.ipynb)
[![Open In Comet](https://images.labml.ai/images/comet.svg?experiment=capsule_networks&file=model)](https://www.comet.ml/labml/diffuse/1260757bcd6148e084ad3a46c38ac5c4?experiment-tab=chart&showOutliers=true&smoothing=0&transformY=smoothing&xAxis=step)
"""
from typing import Tuple, Optional
