
Commit 37f8061

Update 2022-3-6-introducing-pytorch-fully-sharded-data-parallel-api.md
Update image width. Addition of authors. Removal of tutorial link
1 parent a17e82f commit 37f8061

File tree

1 file changed (+3, -3 lines changed)


_posts/2022-3-6-introducing-pytorch-fully-sharded-data-parallel-api.md

Lines changed: 3 additions & 3 deletions
@@ -1,7 +1,7 @@
 ---
 layout: blog_detail
 title: "Introducing PyTorch Fully Sharded Data Parallel (FSDP) API"
-author: Yanli Zhao
+author: Yanli Zhao, Rohan Varma, Chien-Chin Huang, Shen Li, Min Xu, Alban Desmaison
 featured-img: ""
 ---
 
@@ -27,7 +27,7 @@ The figure below shows how FSDP works for 2 data-parallel processes:
 
 
 <p align="center">
-<img src="/assets/images/fsdp_workflow.png" width="60%">
+<img src="/assets/images/fsdp_workflow.png" width="100%">
 </p>
 
 <p align = "center">
@@ -38,7 +38,7 @@ Usually, model layers are wrapped with FSDP in a nested way, so that only layers
 
 ### Using FSDP in PyTorch
 
-There are two ways to wrap a model with PyTorch FSDP. Auto wrapping is a drop-in replacement for DDP; manual wrapping needs minimal changes of model definition code with the ability to explore complex sharding strategies. Please see more details in this [tutorial](http://www.google.com).
+There are two ways to wrap a model with PyTorch FSDP. Auto wrapping is a drop-in replacement for DDP; manual wrapping needs minimal changes of model definition code with the ability to explore complex sharding strategies.
 
 
 # Auto Wrapping
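For context on the sentence changed in the last hunk: it describes FSDP's two wrapping modes, and the auto wrapping path it mentions can be illustrated with a short sketch. The following is a minimal, hypothetical example assuming torch.distributed.fsdp's FullyShardedDataParallel and size_based_auto_wrap_policy; note that the policy keyword has varied across PyTorch releases (older builds used fsdp_auto_wrap_policy), and the single-process gloo group below is only for demonstration, not a real multi-GPU launch.

```python
# Minimal sketch (not the blog post's exact snippet): auto wrapping a toy
# model with FSDP. Treat the policy keyword as illustrative; it has been
# renamed across PyTorch releases.
import functools
import os

import torch.distributed as dist
import torch.nn as nn
from torch.distributed.fsdp import FullyShardedDataParallel as FSDP
from torch.distributed.fsdp.wrap import size_based_auto_wrap_policy

# Single-process process group purely for demonstration; real training
# launches one process per GPU (e.g. via torchrun) with backend="nccl".
os.environ.setdefault("MASTER_ADDR", "localhost")
os.environ.setdefault("MASTER_PORT", "29500")
dist.init_process_group(backend="gloo", rank=0, world_size=1)


class ToyModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.layer1 = nn.Linear(8, 4)
        self.layer2 = nn.Linear(4, 16)
        self.layer3 = nn.Linear(16, 4)

    def forward(self, x):
        return self.layer3(self.layer2(self.layer1(x)))


# Auto wrapping: FSDP recursively wraps submodules that satisfy the policy,
# so the call site stays a drop-in replacement for DDP.
fsdp_model = FSDP(
    ToyModel(),
    auto_wrap_policy=functools.partial(
        size_based_auto_wrap_policy, min_num_params=20
    ),
)
print(fsdp_model)

dist.destroy_process_group()
```

Manual wrapping, by contrast, calls FSDP directly on chosen submodules inside the model definition, which is what lets it express more complex sharding strategies.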
