[MXNET-37] tutorial for distributed training (apache#9152)
* initial draft
* WIP tutorial
* WIP tutorial, with mnist script changes
  Signed-off-by: Rahul <rahulhuilgol@gmail.com>
* use logger
* remove from old page
* first draft of tutorial; remove pythonpath inserts for get_data by moving them to test_utils
* fix typos
* rename functions
* small change in section heading
* fix reimport
* Update distributed_training.md
* Update distributed_training.md: punctuation and minor changes
* fix gluon iterators and address some review comments
* Update multi_devices.md
* Update distributed_training.md: indentation change
* Update distributed_training.md: cmake instruction
* retain only doc changes
* comments addressed
* fix link of gradient compression page
* clarify launch.py usage
* update env var info
* update broken links
* update comment on splitting data