Use Main branch in the docs instead of old Master branch #2164

Merged · 1 commit · Jan 9, 2023

24 changes: 12 additions & 12 deletions CONTRIBUTING.md
@@ -48,7 +48,7 @@ We welcome your pull requests (PR) for updates and fixes.
1. If you haven't already, complete the Contributor License Agreement
("CLA").
1. Fork the repo and create a branch from
- [`master`](https://github.com/pytorch/tutorials).
+ [`main`](https://github.com/pytorch/tutorials).
1. Test your code.
1. Lint your code with a tool such as
[Pylint](https://pylint.pycqa.org/en/latest/).
@@ -88,7 +88,7 @@ commonly-used storage service, such as Amazon S3, and instructing your
users to download the data at the beginning of your tutorial.

The
- [Makefile](https://github.com/pytorch/tutorials/blob/master/Makefile)
+ [Makefile](https://github.com/pytorch/tutorials/blob/main/Makefile)
that we use to build the tutorials contains automation that downloads
required data files.
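
As a point of reference for this hunk, the "download the data at the beginning of your tutorial" pattern usually looks something like the minimal sketch below. The bucket URL and file names here are hypothetical placeholders, not files from this repo; the `nn_tutorial.py` change later in this diff shows the same idea.

```python
from pathlib import Path
from urllib.request import urlretrieve

# Hypothetical example: fetch tutorial data hosted on S3 at the start of a tutorial.
DATA_URL = "https://my-tutorial-data.s3.amazonaws.com/mnist.pkl.gz"  # placeholder URL
DATA_PATH = Path("data")
DATA_PATH.mkdir(parents=True, exist_ok=True)

target = DATA_PATH / "mnist.pkl.gz"
if not target.exists():
    # Download once; later runs reuse the local copy.
    urlretrieve(DATA_URL, str(target))
```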

@@ -248,14 +248,14 @@ For Python files, our CI system runs your code during each build.
[https://github.com/pytorch/tutorials](https://github.com/pytorch/tutorials)

1. Put the tutorial in one of the
- [`beginner_source`](https://github.com/pytorch/tutorials/tree/master/beginner_source),
- [`intermediate_source`](https://github.com/pytorch/tutorials/tree/master/intermediate_source),
- [`advanced_source`](https://github.com/pytorch/tutorials/tree/master/advanced_source)
+ [`beginner_source`](https://github.com/pytorch/tutorials/tree/main/beginner_source),
+ [`intermediate_source`](https://github.com/pytorch/tutorials/tree/main/intermediate_source),
+ [`advanced_source`](https://github.com/pytorch/tutorials/tree/main/advanced_source)
based on the technical level of the content. For recipes, put the
recipe in
- [`recipes_source`](https://github.com/pytorch/tutorials/tree/master/recipes_source).
+ [`recipes_source`](https://github.com/pytorch/tutorials/tree/main/recipes_source).
In addition, for recipes, add the recipe in the recipes
- [README.txt](https://github.com/pytorch/tutorials/blob/master/recipes_source/recipes/README.txt)
+ [README.txt](https://github.com/pytorch/tutorials/blob/main/recipes_source/recipes/README.txt)
file.


@@ -266,9 +266,9 @@ search, you need to include it in `index.rst`, or for recipes, in
`recipes_index.rst`.

1. Open the relevant file
- [`index.rst`](https://github.com/pytorch/tutorials/blob/master/index.rst)
+ [`index.rst`](https://github.com/pytorch/tutorials/blob/main/index.rst)
or
- [`recipes_index.rst`](https://github.com/pytorch/tutorials/blob/master/recipes_source/recipes_index.rst)
+ [`recipes_index.rst`](https://github.com/pytorch/tutorials/blob/main/recipes_source/recipes_index.rst)
1. Add a _card_ in reStructuredText format similar to the following:

```
@@ -300,10 +300,10 @@ might fail to build, and the cards will not display properly.
### Image ###

Add a thumbnail to the
- [`_static/img/thumbnails/cropped`](https://github.com/pytorch/tutorials/tree/master/_static/img/thumbnails/cropped)
+ [`_static/img/thumbnails/cropped`](https://github.com/pytorch/tutorials/tree/main/_static/img/thumbnails/cropped)
directory. Images that render the best are square--that is, they have
equal `x` and `y` dimensions--and also have high resolution. [Here is an
- example](https://github.com/pytorch/tutorials/blob/master/_static/img/thumbnails/cropped/loading-data.PNG).
+ example](https://github.com/pytorch/tutorials/blob/main/_static/img/thumbnails/cropped/loading-data.PNG).

## `toctree` ##

@@ -344,7 +344,7 @@ test your tutorial when you submit your PR.
NOTE: Please do not use [ghstack](https://github.com/ezyang/ghstack). We
do not support ghstack in the [`pytorch/tutorials`](https://github.com/pytorch/tutorials) repo.

- Submit the changes as a PR to the master branch of
+ Submit the changes as a PR to the main branch of
[`pytorch/tutorials`](https://github.com/pytorch/tutorials).

1. Add your changes, commit, and push:
2 changes: 1 addition & 1 deletion README.md
@@ -16,7 +16,7 @@ Here is how you can create a new tutorial (for a detailed description, see [CONT
1. Create a Python file. If you want it executed while inserted into documentation, save the file with the suffix `tutorial` so that the file name is `your_tutorial.py`.
2. Put it in one of the `beginner_source`, `intermediate_source`, `advanced_source` directory based on the level of difficulty. If it is a recipe, add it to `recipes_source`. For tutorials demonstrating unstable prototype features, add to the `prototype_source`.
2. For Tutorials (except if it is a prototype feature), include it in the `toctree` directive and create a `customcarditem` in [index.rst](./index.rst).
- 3. For Tutorials (except if it is a prototype feature), create a thumbnail in the [index.rst file](https://github.com/pytorch/tutorials/blob/master/index.rst) using a command like `.. customcarditem:: beginner/your_tutorial.html`. For Recipes, create a thumbnail in the [recipes_index.rst](https://github.com/pytorch/tutorials/blob/master/recipes_source/recipes_index.rst)
+ 3. For Tutorials (except if it is a prototype feature), create a thumbnail in the [index.rst file](https://github.com/pytorch/tutorials/blob/main/index.rst) using a command like `.. customcarditem:: beginner/your_tutorial.html`. For Recipes, create a thumbnail in the [recipes_index.rst](https://github.com/pytorch/tutorials/blob/main/recipes_source/recipes_index.rst)

If you are starting off with a Jupyter notebook, you can use [this script](https://gist.github.com/chsasank/7218ca16f8d022e02a9c0deb94a310fe) to convert the notebook to a Python file. After converting it and adding it to the project, please make sure that section headings and other content remain in a logical order.
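
For readers who want a quick local alternative to that script, a rough sketch of the same idea using `nbformat` follows. This is only an illustration under the assumption that `nbformat` is installed, not the linked gist, and `notebook_to_py` is a hypothetical helper name.

```python
import nbformat

def notebook_to_py(nb_path: str, py_path: str) -> None:
    """Rough sketch: write a notebook's cells out as a .py file."""
    nb = nbformat.read(nb_path, as_version=4)
    with open(py_path, "w") as f:
        for cell in nb.cells:
            if cell.cell_type == "code":
                f.write(cell.source + "\n\n")
            elif cell.cell_type == "markdown":
                # Keep prose as comments so section headings stay in order.
                for line in cell.source.splitlines():
                    f.write(f"# {line}\n")
                f.write("\n")

# Example usage:
# notebook_to_py("your_tutorial.ipynb", "your_tutorial.py")
```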

2 changes: 1 addition & 1 deletion advanced_source/generic_join.rst
@@ -4,7 +4,7 @@ Distributed Training with Uneven Inputs Using the Join Context Manager
**Author**\ : `Andrew Gu <https://github.com/andwgu>`_

.. note::
- |edit| View and edit this tutorial in `github <https://github.com/pytorch/tutorials/blob/master/advanced_source/generic_join.rst>`__.
+ |edit| View and edit this tutorial in `github <https://github.com/pytorch/tutorials/blob/main/advanced_source/generic_join.rst>`__.

.. note:: ``Join`` is introduced in PyTorch 1.10 as a prototype feature. This
API is subject to change.
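
For orientation while skimming this diff, a minimal sketch of how the ``Join`` context manager is typically used with DDP is shown below. It assumes the default process group has already been initialized and uses a toy model and toy inputs; details and the full example are in the tutorial itself.

```python
import torch
from torch.distributed.algorithms.join import Join
from torch.nn.parallel import DistributedDataParallel as DDP

# Assumes torch.distributed.init_process_group(...) has already been called.
model = DDP(torch.nn.Linear(8, 1))

# Each rank may have a different number of inputs; Join shadows the
# collective communications of ranks that run out of inputs early.
inputs = [torch.randn(4, 8) for _ in range(10)]

with Join([model]):
    for batch in inputs:
        loss = model(batch).sum()
        loss.backward()  # optimizer step / zero_grad omitted in this sketch
```
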
2 changes: 1 addition & 1 deletion advanced_source/rpc_ddp_tutorial.rst
@@ -3,7 +3,7 @@ Combining Distributed DataParallel with Distributed RPC Framework
**Authors**: `Pritam Damania <https://github.com/pritamdamania87>`_ and `Yi Wang <https://github.com/SciPioneer>`_

.. note::
- |edit| View and edit this tutorial in `github <https://github.com/pytorch/tutorials/blob/master/advanced_source/rpc_ddp_tutorial.rst>`__.
+ |edit| View and edit this tutorial in `github <https://github.com/pytorch/tutorials/blob/main/advanced_source/rpc_ddp_tutorial.rst>`__.

This tutorial uses a simple example to demonstrate how you can combine
`DistributedDataParallel <https://pytorch.org/docs/stable/nn.html#torch.nn.parallel.DistributedDataParallel>`__ (DDP)
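
A compressed sketch of the setup this tutorial walks through, with both a process group for DDP and the RPC framework initialized in the same trainer process, might look roughly like the following. The function name `run_trainer` and the toy model are hypothetical; the real example lives in the tutorial.

```python
import torch
import torch.distributed as dist
import torch.distributed.rpc as rpc
from torch.nn.parallel import DistributedDataParallel as DDP

def run_trainer(rank: int, world_size: int) -> None:
    # Assumes MASTER_ADDR / MASTER_PORT are set in the environment.
    # One process group for the data-parallel part of the model ...
    dist.init_process_group("gloo", rank=rank, world_size=world_size)
    # ... and RPC for model pieces that live on a remote parameter server.
    # In practice the RPC agent and the process group rendezvous on different ports.
    rpc.init_rpc(f"trainer{rank}", rank=rank, world_size=world_size)

    local_net = DDP(torch.nn.Linear(16, 4))  # replicated part, synced by DDP
    # Remote pieces would be created with rpc.remote(...) / RemoteModule and
    # combined with local_net inside the forward pass (see the tutorial).

    rpc.shutdown()
    dist.destroy_process_group()
```
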
2 changes: 1 addition & 1 deletion beginner_source/dist_overview.rst
@@ -3,7 +3,7 @@ PyTorch Distributed Overview
**Author**: `Shen Li <https://mrshenli.github.io/>`_

.. note::
- |edit| View and edit this tutorial in `github <https://github.com/pytorch/tutorials/blob/master/beginner_source/dist_overview.rst>`__.
+ |edit| View and edit this tutorial in `github <https://github.com/pytorch/tutorials/blob/main/beginner_source/dist_overview.rst>`__.

This is the overview page for the ``torch.distributed`` package. The goal of
this page is to categorize documents into different topics and briefly
2 changes: 1 addition & 1 deletion beginner_source/former_torchies/parallelism_tutorial.py
@@ -132,7 +132,7 @@ def forward(self, x):
# - `Discuss PyTorch on the Forums`_
# - `Chat with other users on Slack`_
#
- # .. _`Deep Learning with PyTorch: a 60-minute blitz`: https://github.com/pytorch/tutorials/blob/master/Deep%20Learning%20with%20PyTorch.ipynb
+ # .. _`Deep Learning with PyTorch: a 60-minute blitz`: https://github.com/pytorch/tutorials/blob/main/Deep%20Learning%20with%20PyTorch.ipynb
# .. _Train a state-of-the-art ResNet network on imagenet: https://github.com/pytorch/examples/tree/master/imagenet
# .. _Train a face generator using Generative Adversarial Networks: https://github.com/pytorch/examples/tree/master/dcgan
# .. _Train a word-level language model using Recurrent LSTM networks: https://github.com/pytorch/examples/tree/master/word_language_model
2 changes: 1 addition & 1 deletion beginner_source/nn_tutorial.py
@@ -47,7 +47,7 @@

PATH.mkdir(parents=True, exist_ok=True)

- URL = "https://github.com/pytorch/tutorials/raw/master/_static/"
+ URL = "https://github.com/pytorch/tutorials/raw/main/_static/"
FILENAME = "mnist.pkl.gz"

if not (PATH / FILENAME).exists():
2 changes: 1 addition & 1 deletion intermediate_source/FSDP_tutorial.rst
@@ -4,7 +4,7 @@ Getting Started with Fully Sharded Data Parallel(FSDP)
**Author**: `Hamid Shojanazeri <https://github.com/HamidShojanazeri>`__, `Yanli Zhao <https://github.com/zhaojuanmao>`__, `Shen Li <https://mrshenli.github.io/>`__

.. note::
- |edit| View and edit this tutorial in `github <https://github.com/pytorch/tutorials/blob/master/intermediate_source/FSDP_tutorial.rst>`__.
+ |edit| View and edit this tutorial in `github <https://github.com/pytorch/tutorials/blob/main/intermediate_source/FSDP_tutorial.rst>`__.

Training AI models at a large scale is a challenging task that requires a lot of compute power and resources.
It also comes with considerable engineering complexity to handle the training of these very large models.
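
For readers skimming the diff, the core pattern the tutorial builds up to is simply wrapping the model with FSDP. A minimal hedged sketch follows, assuming one GPU per rank, rendezvous variables set in the environment, and a placeholder `model` argument; sharding policies and the full training loop are covered in the tutorial.

```python
import torch
import torch.distributed as dist
from torch.distributed.fsdp import FullyShardedDataParallel as FSDP

def setup_fsdp(rank: int, world_size: int, model: torch.nn.Module) -> FSDP:
    # Assumes MASTER_ADDR / MASTER_PORT are set in the environment.
    dist.init_process_group("nccl", rank=rank, world_size=world_size)
    torch.cuda.set_device(rank)
    # FSDP shards parameters, gradients, and optimizer state across ranks.
    return FSDP(model.to(rank))
```
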
2 changes: 1 addition & 1 deletion intermediate_source/ax_multiobjective_nas_tutorial.py
@@ -46,7 +46,7 @@
# -----------------------
#
# Our goal is to optimize the PyTorch Lightning training job defined in
- # `mnist_train_nas.py <https://github.com/pytorch/tutorials/tree/master/intermediate_source/mnist_train_nas.py>`__.
+ # `mnist_train_nas.py <https://github.com/pytorch/tutorials/tree/main/intermediate_source/mnist_train_nas.py>`__.
# To do this using TorchX, we write a helper function that takes in
# the values of the architecture and hyperparameters of the training
# job and creates a `TorchX AppDef <https://pytorch.org/torchx/latest/basics.html>`__
2 changes: 1 addition & 1 deletion intermediate_source/ddp_tutorial.rst
@@ -5,7 +5,7 @@ Getting Started with Distributed Data Parallel
**Edited by**: `Joe Zhu <https://github.com/gunandrose4u>`_

.. note::
- |edit| View and edit this tutorial in `github <https://github.com/pytorch/tutorials/blob/master/intermediate_source/ddp_tutorial.rst>`__.
+ |edit| View and edit this tutorial in `github <https://github.com/pytorch/tutorials/blob/main/intermediate_source/ddp_tutorial.rst>`__.

Prerequisites:

2 changes: 1 addition & 1 deletion intermediate_source/dist_pipeline_parallel_tutorial.rst
@@ -3,7 +3,7 @@ Distributed Pipeline Parallelism Using RPC
**Author**: `Shen Li <https://mrshenli.github.io/>`_

.. note::
- |edit| View and edit this tutorial in `github <https://github.com/pytorch/tutorials/blob/master/intermediate_source/dist_pipeline_parallel_tutorial.rst>`__.
+ |edit| View and edit this tutorial in `github <https://github.com/pytorch/tutorials/blob/main/intermediate_source/dist_pipeline_parallel_tutorial.rst>`__.

Prerequisites:

2 changes: 1 addition & 1 deletion intermediate_source/dist_tuto.rst
@@ -3,7 +3,7 @@ Writing Distributed Applications with PyTorch
**Author**: `Séb Arnold <https://seba1511.com>`_

.. note::
- |edit| View and edit this tutorial in `github <https://github.com/pytorch/tutorials/blob/master/intermediate_source/dist_tuto.rst>`__.
+ |edit| View and edit this tutorial in `github <https://github.com/pytorch/tutorials/blob/main/intermediate_source/dist_tuto.rst>`__.

Prerequisites:

2 changes: 1 addition & 1 deletion intermediate_source/process_group_cpp_extension_tutorial.rst
@@ -4,7 +4,7 @@ Customize Process Group Backends Using Cpp Extensions
**Author**: `Feng Tian <https://github.com/ftian1>`__, `Shen Li <https://mrshenli.github.io/>`__, `Min Si <https://minsii.github.io/>`__

.. note::
- |edit| View and edit this tutorial in `github <https://github.com/pytorch/tutorials/blob/master/intermediate_source/process_group_cpp_extension_tutorial.rst>`__.
+ |edit| View and edit this tutorial in `github <https://github.com/pytorch/tutorials/blob/main/intermediate_source/process_group_cpp_extension_tutorial.rst>`__.

Prerequisites:

2 changes: 1 addition & 1 deletion intermediate_source/rpc_async_execution.rst
@@ -3,7 +3,7 @@ Implementing Batch RPC Processing Using Asynchronous Executions
**Author**: `Shen Li <https://mrshenli.github.io/>`_

.. note::
- |edit| View and edit this tutorial in `github <https://github.com/pytorch/tutorials/blob/master/intermediate_source/rpc_async_execution.rst>`__.
+ |edit| View and edit this tutorial in `github <https://github.com/pytorch/tutorials/blob/main/intermediate_source/rpc_async_execution.rst>`__.

Prerequisites:

2 changes: 1 addition & 1 deletion intermediate_source/rpc_param_server_tutorial.rst
@@ -5,7 +5,7 @@ Implementing a Parameter Server Using Distributed RPC Framework
**Author**\ : `Rohan Varma <https://github.com/rohan-varma>`_

.. note::
- |edit| View and edit this tutorial in `github <https://github.com/pytorch/tutorials/blob/master/intermediate_source/rpc_param_server_tutorial.rst>`__.
+ |edit| View and edit this tutorial in `github <https://github.com/pytorch/tutorials/blob/main/intermediate_source/rpc_param_server_tutorial.rst>`__.

Prerequisites:

2 changes: 1 addition & 1 deletion intermediate_source/rpc_tutorial.rst
@@ -3,7 +3,7 @@ Getting Started with Distributed RPC Framework
**Author**: `Shen Li <https://mrshenli.github.io/>`_

.. note::
- |edit| View and edit this tutorial in `github <https://github.com/pytorch/tutorials/blob/master/intermediate_source/rpc_tutorial.rst>`__.
+ |edit| View and edit this tutorial in `github <https://github.com/pytorch/tutorials/blob/main/intermediate_source/rpc_tutorial.rst>`__.

Prerequisites:

6 changes: 3 additions & 3 deletions prototype_source/README.txt
@@ -10,15 +10,15 @@ Prototype Tutorials

3. graph_mode_dynamic_bert_tutorial.rst
Graph Mode Dynamic Quantization on BERT
- https://github.com/pytorch/tutorials/blob/master/prototype_source/graph_mode_dynamic_bert_tutorial.rst
+ https://github.com/pytorch/tutorials/blob/main/prototype_source/graph_mode_dynamic_bert_tutorial.rst

4. numeric_suite_tutorial.py
PyTorch Numeric Suite Tutorial
- https://github.com/pytorch/tutorials/blob/master/prototype_source/numeric_suite_tutorial.py
+ https://github.com/pytorch/tutorials/blob/main/prototype_source/numeric_suite_tutorial.py

5. torchscript_freezing.py
Model Freezing in TorchScript
- https://github.com/pytorch/tutorials/blob/master/prototype_source/torchscript_freezing.py
+ https://github.com/pytorch/tutorials/blob/main/prototype_source/torchscript_freezing.py

6. vulkan_workflow.rst
Vulkan Backend User Workflow