1 file changed: +3 −3

@@ -70,10 +70,10 @@ We can distribute and run this function (e.g. on 2 machines x 2 GPUs) using **`t
 
 ```python
 import logging
-logging.basicConfig(level=logging.INFO)
-
 import torchrunx
 
+logging.basicConfig(level=logging.INFO)
+
 launcher = torchrunx.Launcher(
     hostnames=["localhost", "second_machine"],  # or IP addresses
     workers_per_host=2  # e.g. number of GPUs per host
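
A side note on why the placement of the `logging.basicConfig` call can matter (a general Python stdlib fact, not something this diff states as its motivation): `basicConfig` configures the root logger only if it has no handlers yet, so a later call without `force=True` is silently ignored. A minimal stdlib-only sketch:

```python
import logging

# basicConfig attaches a handler to the root logger only when none exist;
# a second call without force=True is a no-op, so call order matters.
logging.basicConfig(level=logging.INFO)
level_after_first = logging.getLogger().level

logging.basicConfig(level=logging.DEBUG)  # ignored: root already has a handler
level_after_second = logging.getLogger().level

print(level_after_first == logging.INFO)   # True
print(level_after_second == logging.INFO)  # True: the DEBUG level was not applied
```

Because of this, `basicConfig` should run before (or after, depending on the desired behavior) any import that might itself configure or emit logging.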
@@ -93,7 +93,7 @@ trained_model: nn.Module = results.rank(0)
 # or: results.index(hostname="localhost", local_rank=0)
 
 # and continue your script
-torch.save(trained_model.state_dict(), "output/model.pth")
+torch.save(trained_model.state_dict(), "outputs/model.pth")
 ```
 
 **See more examples where we fine-tune LLMs using:**