
GELU does not appear to support approximate tanh #1368

Open
@travisjj

Description


PyTorch's GELU supports an optional approximation mode that internally uses tanh.

See more here:
https://pytorch.org/docs/stable/generated/torch.nn.GELU.html#torch.nn.GELU

I was expecting this to just work:

var gelu = nn.GELU(approximate: "tanh");

When the approximate argument is "tanh", GELU is estimated with a tanh-based formula, which produces noticeably different results from the default exact (erf-based) computation.

Since this is supported natively in PyTorch, is it possible to add the "approximate" option to TorchSharp's GELU?

In the meantime, is there a way for me to do this myself without waiting for a new version of the library to be published?
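For reference, the two formulas from the linked PyTorch docs can be compared numerically. Below is a minimal Python sketch using only the standard library, with scalar inputs for clarity; as a user-side workaround, the same element-wise formula could be written in C# against TorchSharp's tensor operations without any change to the library itself:

```python
import math

def gelu_exact(x: float) -> float:
    # Default GELU: x * Phi(x), where Phi is the standard normal CDF,
    # expressed via the error function.
    return 0.5 * x * (1.0 + math.erf(x / math.sqrt(2.0)))

def gelu_tanh(x: float) -> float:
    # approximate="tanh" variant from the PyTorch docs:
    # 0.5 * x * (1 + tanh(sqrt(2/pi) * (x + 0.044715 * x^3)))
    return 0.5 * x * (
        1.0 + math.tanh(math.sqrt(2.0 / math.pi) * (x + 0.044715 * x ** 3))
    )
```

The two agree closely for moderate inputs (e.g. at x = 1.0 they differ by roughly 1e-4), but they are not identical, which is why the missing option matters.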
