This repository was archived by the owner on Sep 19, 2022. It is now read-only.

Change Distributed Data Parallel example #124

Merged 2 commits into kubeflow:master from andreyvelich:122-change-ddp-example on Jan 11, 2019

Conversation

@andreyvelich (Member) commented on Jan 10, 2019

Fixes: #122.
I changed the Distributed Data Parallel example; it now uses the PyTorch library for DDP.
We can't use the mpi backend for Distributed Data Parallel when training on GPUs, so I changed it to the nccl backend, as noted here.
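For readers of this thread, here is a minimal, hypothetical sketch (not the code in this PR) of the backend choice being described: the process group is initialized with nccl when a GPU is available, and the model is wrapped in torch.nn.parallel.DistributedDataParallel. The placeholder model and the LOCAL_RANK / RANK / WORLD_SIZE / MASTER_ADDR / MASTER_PORT environment variables are assumptions here, expected to be set by the launcher (e.g. a PyTorchJob).

```python
import os

import torch
import torch.distributed as dist
import torch.nn as nn
from torch.nn.parallel import DistributedDataParallel


def main():
    # Backend selection: mpi cannot be used for DDP on GPUs, so use nccl
    # when CUDA is available and fall back to gloo on CPU-only workers.
    use_cuda = torch.cuda.is_available()
    backend = "nccl" if use_cuda else "gloo"

    # init_method defaults to "env://", so RANK, WORLD_SIZE, MASTER_ADDR and
    # MASTER_PORT are assumed to be provided by the launcher.
    dist.init_process_group(backend=backend)

    # LOCAL_RANK is an assumed environment variable identifying the GPU for
    # this process; defaults to 0 for a single-GPU worker.
    local_rank = int(os.environ.get("LOCAL_RANK", 0))
    device = torch.device(f"cuda:{local_rank}" if use_cuda else "cpu")

    model = nn.Linear(10, 1).to(device)  # placeholder model for illustration
    ddp_model = DistributedDataParallel(
        model, device_ids=[local_rank] if use_cuda else None
    )

    # ... the training loop over a DistributedSampler-backed DataLoader
    # would go here ...

    dist.destroy_process_group()


if __name__ == "__main__":
    main()
```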


@andreyvelich (Member, Author) commented: @johnugeorge

@coveralls

Coverage Status: Coverage remained the same at 73.269% when pulling fc0ff0b on andreyvelich:122-change-ddp-example into 306edb5 on kubeflow:master.

@johnugeorge (Member)

/lgtm
/approve

@k8s-ci-robot

[APPROVALNOTIFIER] This PR is APPROVED

This pull request has been approved by: johnugeorge

The full list of commands accepted by this bot can be found here.

The pull request process is described here

Needs approval from an approver in each of these files:

Approvers can indicate their approval by writing /approve in a comment
Approvers can cancel approval by writing /approve cancel in a comment

@k8s-ci-robot k8s-ci-robot merged commit 9261b60 into kubeflow:master Jan 11, 2019