
Commit d81e832

Remove skip_init tutorial (#3562)

Fixes [T237454734](https://www.internalfb.com/intern/tasks/?t=237454734). skip_init was originally added to support skipping initialization during load_state_dict; see pytorch/pytorch#29523. It has since been superseded by creating the model under the meta device context manager and then loading the state dict with assign=True; see https://docs.pytorch.org/tutorials/recipes/recipes/module_load_state_dict_tips.html#using-torch-device-meta. This tutorial can therefore be removed.

---------

Co-authored-by: Svetlana Karslioglu <svekars@meta.com>
1 parent: 91a1891 · commit: d81e832
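For context, here is a minimal sketch of the replacement pattern the commit message refers to. The state dict below is a placeholder for a real checkpoint, and the `nn.Linear` module is just an illustrative example:

```python
import torch
import torch.nn as nn

# Construct the module on the meta device: parameters get shapes and
# dtypes but no storage, so no initialization work is performed.
with torch.device("meta"):
    model = nn.Linear(10, 5)

# Placeholder state dict standing in for a checkpoint loaded from disk,
# e.g. torch.load("checkpoint.pt").
state_dict = {
    "weight": torch.randn(5, 10),
    "bias": torch.randn(5),
}

# assign=True (available in PyTorch >= 2.1) replaces the meta-device
# parameters with the loaded tensors instead of copying into them,
# which is what makes loading into a meta-device module possible.
model.load_state_dict(state_dict, assign=True)
```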

File tree

3 files changed: +1 −136 lines changed

redirects.py

Lines changed: 1 addition & 0 deletions

@@ -38,4 +38,5 @@
     "recipes/recipes_index.html": "../recipes_index.html",
     "recipes/torchserve_vertexai_tutorial.html": "../index.html",
     "unstable_source/vulkan_workflow.rst": "../index.html",
+    "unstable/skip_param_init.html": "https://docs.pytorch.org/tutorials/recipes/recipes/module_load_state_dict_tips.html",
 }

unstable_index.rst

Lines changed: 0 additions & 9 deletions

@@ -45,15 +45,6 @@ decide if we want to upgrade the level of commitment or to fail fast.
    :link: unstable/semi_structured_sparse.html
    :tags: Model-Optimiziation
 
-.. Modules
-
-.. customcarditem::
-   :header: Skipping Module Parameter Initialization in PyTorch 1.10
-   :card_description: Describes skipping parameter initialization during module construction in PyTorch 1.10, avoiding wasted computation.
-   :image: _static/img/thumbnails/cropped/generic-pytorch-logo.png
-   :link: unstable/skip_param_init.html
-   :tags: Modules
-
 .. vmap
 
 .. customcarditem::

unstable_source/skip_param_init.rst

Lines changed: 0 additions & 127 deletions
This file was deleted.
