
Code sample for basic mark sharding doesn't work #9178

@yaoshiang

Description


📚 Documentation

This document:

https://docs.pytorch.org/xla/master/learn/api-guide.html#module-torch_xla.distributed.spmd

contains an important code sample demonstrating how to shard tensors across devices. As written, it doesn't run: the imports and setup it depends on are not included.
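
For reference, here's roughly what a self-contained version would need to look like. This is an untested sketch, not the exact sample from the docs; the mesh shape, axis names, and tensor size are illustrative:

```python
import numpy as np
import torch
import torch_xla.core.xla_model as xm
import torch_xla.runtime as xr
import torch_xla.distributed.spmd as xs

# Enable SPMD execution mode before creating any XLA tensors.
xr.use_spmd()

# Build a device mesh over all addressable devices.
# (mesh shape and axis names here are illustrative)
num_devices = xr.global_runtime_device_count()
mesh_shape = (num_devices, 1)
device_ids = np.arange(num_devices)
mesh = xs.Mesh(device_ids, mesh_shape, ('data', 'model'))

# Create a tensor on the XLA device and annotate its sharding:
# dim 0 is split across the 'data' axis, dim 1 across 'model'.
t = torch.randn(8, 4).to(xm.xla_device())
xs.mark_sharding(t, mesh, ('data', 'model'))
```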

More broadly, all of these samples should go into a larger guide that gently walks a user through how PT/XLA handles multi-device and multi-host execution, up through gSPMD. It's very elegant and powerful, but poorly documented.
