Closed
Description
Aloha!
I was testing out pytorch-pix2pix on SageMaker, and I noticed that training doesn't run any faster on a machine with 8 GPUs than on one with a single GPU. I do see the correct option in the output (`gpu_ids: 0,1,2,3,4,5,6,7 [default: 0]`), so that part looks OK.
Any ideas where I can start? This article https://medium.com/@julsimon/training-with-pytorch-on-amazon-sagemaker-58fca8c69987 just says "Multi-GPU training is also possible but requires extra work"
but doesn't say what needs to be done.
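One thing worth checking (an assumption, not a confirmed diagnosis): if the repo parallelizes with `torch.nn.DataParallel`, it speeds things up by splitting each batch along dimension 0 across the GPUs. With a batch size of 1, there is nothing to split, so only one GPU ever receives work and the other seven sit idle regardless of `gpu_ids`. The sketch below simulates that scatter step (the `scatter_batch` helper is hypothetical, written to mirror how `tensor.chunk` divides a batch) to show why a small batch size defeats multi-GPU training:

```python
import math

def scatter_batch(batch_size, num_gpus):
    # Hypothetical helper mimicking how a batch is split along dim 0
    # across GPU replicas (chunk-style: ceil-sized pieces until exhausted).
    step = math.ceil(batch_size / num_gpus)
    sizes = []
    remaining = batch_size
    while remaining > 0:
        take = min(step, remaining)
        sizes.append(take)
        remaining -= take
    return sizes  # samples assigned to each active GPU

# With batch size 1, only one of the 8 GPUs gets any work:
print(scatter_batch(1, 8))   # -> [1]
# A batch of 16 keeps all 8 GPUs busy with 2 samples each:
print(scatter_batch(16, 8))  # -> [2, 2, 2, 2, 2, 2, 2, 2]
```

If this is the cause, raising the training batch size (e.g. via the repo's batch-size option) so it is a multiple of the GPU count should make the extra GPUs actually contribute.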