## Feature

Currently, `brulee_mlp()` supports the "relu", "elu", "tanh", and "linear" activation functions. It would be great to also have "sigmoid" and "softmax", since they are already available via `torch::nn_sigmoid()` and `torch::nn_softmax()`. I assume it would only take a few additional lines in `get_activation_fn()`: https://github.com/tidymodels/brulee/blob/087129b0a71e63f16137934f89091b4db7fa4351/R/mlp-fit.R#L865C22-L865C22
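Something along these lines might work — a rough, untested sketch only; the function name comes from the link above, but its exact structure here is assumed, not copied from brulee's source:

```r
# Hypothetical sketch of get_activation_fn() with the two new branches added.
# The existing branches and error handling are assumptions about the real code.
get_activation_fn <- function(arg) {
  switch(
    arg,
    relu    = torch::nn_relu(),
    elu     = torch::nn_elu(),
    tanh    = torch::nn_tanh(),
    linear  = torch::nn_identity(),
    sigmoid = torch::nn_sigmoid(),
    # nn_softmax() requires a `dim` argument; dim = 2 would apply softmax
    # across the columns (classes) of an n x k batch of predictions
    softmax = torch::nn_softmax(dim = 2),
    rlang::abort(paste0("Unknown activation function '", arg, "'"))
  )
}
```

One wrinkle: unlike the other activations, `nn_softmax()` needs a `dim`, so the right value may depend on how brulee shapes its tensors internally.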