[docs] Typo - fixes ray-project#22761 (ray-project#22763)
Signed-off-by: Max Pumperla <max.pumperla@googlemail.com>
maxpumperla authored Mar 2, 2022
1 parent a9bf5e9 commit d53d0e0
Showing 2 changed files with 2 additions and 2 deletions.
doc/source/ray-overview/index.md (1 addition, 1 deletion)
@@ -127,7 +127,7 @@ All you have to do is use the ``ray.train.torch.prepare_model`` and
``ray.train.torch.prepare_data_loader`` utility functions to
easily setup your model & data for distributed training.
This will automatically wrap your model with ``DistributedDataParallel``
-and place it on the right device, and add ``DisributedSampler`` to your DataLoaders.
+and place it on the right device, and add ``DistributedSampler`` to your DataLoaders.
```{literalinclude} /../../python/ray/train/examples/torch_quick_start.py
:language: python
doc/source/train/train.rst (1 addition, 1 deletion)
@@ -87,7 +87,7 @@ system. Let's take following simple examples:
``ray.train.torch.prepare_data_loader`` utility functions to
easily setup your model & data for distributed training.
This will automatically wrap your model with ``DistributedDataParallel``
-and place it on the right device, and add ``DisributedSampler`` to your DataLoaders.
+and place it on the right device, and add ``DistributedSampler`` to your DataLoaders.

.. literalinclude:: /../../python/ray/train/examples/torch_quick_start.py
:language: python
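
Both hunks describe the same pattern: inside your training function, pass the model through ``ray.train.torch.prepare_model`` and the DataLoader through ``ray.train.torch.prepare_data_loader``. The runnable example the docs pull in is ``python/ray/train/examples/torch_quick_start.py``; the sketch below only illustrates that pattern with a toy model and dataset of my own choosing, and the launch step appears as a hedged comment since it is not part of this diff.

```python
# Minimal illustrative sketch of the pattern described above, not the actual
# torch_quick_start.py example. The model, dataset, batch size, and epoch
# count are placeholder assumptions.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

import ray.train.torch  # provides prepare_model and prepare_data_loader


def train_func():
    # Toy model and data; substitute your own.
    model = nn.Linear(4, 1)
    dataset = TensorDataset(torch.randn(64, 4), torch.randn(64, 1))
    data_loader = DataLoader(dataset, batch_size=8)

    # Wraps the model in DistributedDataParallel and moves it to the device
    # assigned to this worker.
    model = ray.train.torch.prepare_model(model)
    # Adds a DistributedSampler so each worker trains on its own data shard.
    data_loader = ray.train.torch.prepare_data_loader(data_loader)

    loss_fn = nn.MSELoss()
    optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
    for _ in range(2):
        for X, y in data_loader:
            optimizer.zero_grad()
            loss = loss_fn(model(X), y)
            loss.backward()
            optimizer.step()


# Launching train_func across workers is handled by Ray Train; around the
# time of this commit the pattern was roughly:
#
#   from ray.train import Trainer
#   trainer = Trainer(backend="torch", num_workers=2)
#   trainer.start()
#   trainer.run(train_func)
#   trainer.shutdown()
```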

