Starting with release 0.96.9, you can load TorchScript modules and functions that have been either traced or scripted in PyTorch. It is, however, not yet possible to create a TorchScript module from scratch using TorchSharp. Refer to the [PyTorch JIT](https://pytorch.org/docs/stable/jit.html) docs for information on how to create such a file.
TorchScript is very powerful, because it allows you to save the logic and the weights of a model together, and it furthermore allows the module to be loaded into another program, __without any dependencies on the Python runtime.__ Thus, you can load a model that has been serialized using TorchScript and have it behave as any TorchScript module -- you can use it for training, or you can use it for inference.
Once you have a TorchScript file, you can load it into TorchSharp using:
```C#
var m = torch.jit.load("file-name");
```
It returns a ScriptModule, which behaves just like any other TorchSharp module. Whether the original script came from a module or a function, it is deserialized as a module. You can use it for training or inference by calling either `train()` or `eval()`. ScriptModules always start out on the CPU, so you have to call `cuda()` in order to move one to a GPU.
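For example, loading and running a module for inference might look like the following sketch. The file name and input shape are placeholders for your own model, and this assumes a CUDA device is available:

```C#
// Load a TorchScript module, switch it to inference mode,
// and move it to the GPU (ScriptModules start out on the CPU).
var m = torch.jit.load("model.pt");   // "model.pt" is a placeholder path
m.eval();
m.cuda();

// The input shape (1, 3, 224, 224) is only illustrative.
var input = torch.rand(1, 3, 224, 224).cuda();
var output = m.forward(input);
```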
Note that if you used __tracing__ to create the TorchScript file in PyTorch, submodules that behave differently in training and eval modes will behave according to the mode they were traced in.
If you use the script module to train, you may want or need to save it afterwards.
That is easily done using `save()`:
```C#
torch.jit.save(m, "file-name");
```
While it is possible to save a modified ScriptModule from TorchSharp, it is not (yet) possible to create one _from scratch_ using either tracing or scripting. Another limitation is that the TorchSharp code assumes that the `forward()` function takes only tensors as its arguments and returns a single tensor, a limitation it shares with other TorchSharp modules.