Sigmoid and softmax activation functions for brulee_mlp() #69

@christophscheuch

Description

Feature

Currently, brulee_mlp() supports the "relu", "elu", "tanh", and "linear" activation functions. It would be great to also have "sigmoid" and "softmax", since both are already supported via torch::nn_sigmoid() and torch::nn_softmax(). I assume it would only take a few additional lines in get_activation_fn(): https://github.com/tidymodels/brulee/blob/087129b0a71e63f16137934f89091b4db7fa4351/R/mlp-fit.R#L865C22-L865C22
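
For illustration, here is a minimal sketch of what the additions might look like, assuming get_activation_fn() dispatches on the activation name. The switch-based structure, the "linear" mapping to torch::nn_identity(), and the softmax dim argument are assumptions for this sketch, not taken from the brulee source:

```r
library(torch)

# Hypothetical sketch of get_activation_fn() with the two proposed
# activations added; not the actual implementation in R/mlp-fit.R.
get_activation_fn <- function(arg, ...) {
  switch(
    arg,
    relu    = torch::nn_relu(),
    elu     = torch::nn_elu(),
    tanh    = torch::nn_tanh(),
    linear  = torch::nn_identity(),
    sigmoid = torch::nn_sigmoid(),        # proposed addition
    # nn_softmax() requires a `dim`; dim = 2 (across the columns of a
    # 2-D batch tensor) is an assumption that would need to be checked
    # against how brulee shapes its tensors.
    softmax = torch::nn_softmax(dim = 2), # proposed addition
    stop("Unknown activation function '", arg, "'")
  )
}

# Example: apply the sigmoid module to a tensor
act <- get_activation_fn("sigmoid")
act(torch_randn(3, 4))
```

One design question for softmax is which dimension to normalize over; unlike the elementwise activations, torch::nn_softmax() needs that choice made explicitly.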
