GELU optionally supports a tanh-based approximation internally.
See more here:
https://pytorch.org/docs/stable/generated/torch.nn.GELU.html#torch.nn.GELU
I was expecting this to just work:
var gelu = nn.GELU(approximate: "tanh");
When the approximate argument is "tanh", GELU is estimated with a tanh-based approximation; the default ("none") computes the exact definition, and the results differ noticeably.
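For reference, the two variants as given in the linked PyTorch documentation:

```
GELU(x)      = x * Phi(x)                                        (approximate = "none", the default)
GELU_tanh(x) = 0.5 * x * (1 + tanh(sqrt(2/pi) * (x + 0.044715 * x^3)))   (approximate = "tanh")
```

where Phi(x) is the cumulative distribution function of the standard normal distribution.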
Is it possible, since this is supported natively, to include the "approximate" property for TorchSharp's GELU?
Is there a way for me to achieve this myself, without going through the trouble of publishing a new version of the library?
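Until the binding exposes the parameter, one possible workaround is to compute the tanh approximation directly from primitive tensor ops. This is only a sketch under the assumption that TorchSharp supports the usual scalar/tensor operator overloads, `torch.tanh`, and `Tensor.pow`; `TanhGelu` is a hypothetical helper name, not part of TorchSharp:

```csharp
using System;
using TorchSharp;
using static TorchSharp.torch;

static class GeluApprox
{
    // Hypothetical helper: tanh approximation of GELU, per the PyTorch docs:
    //   0.5 * x * (1 + tanh(sqrt(2/pi) * (x + 0.044715 * x^3)))
    // Assumes TorchSharp's scalar-tensor operator overloads and torch.tanh.
    public static Tensor TanhGelu(Tensor x)
    {
        var c = Math.Sqrt(2.0 / Math.PI);
        return 0.5 * x * (1.0 + tanh(c * (x + 0.044715 * x.pow(3))));
    }
}
```

This would avoid waiting for a library release, at the cost of losing whatever fused/native kernel the real `approximate: "tanh"` path would use.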