
Option to use SELU for the activation layer in mlp via keras #1127

Closed
@obgeneralao

Description

Is it possible to add the "selu" activation function for the multilayer perceptron via keras?
Also, how can I use the "Adamax" optimizer instead of the default "Adam"?

I am using keras==2.15 and tensorflow==2.15 and parsnip==1.2.1.

Thanks a lot.
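For reference, independent of whatever hook parsnip ends up exposing, the two pieces being requested are well defined: SELU is the scaled exponential linear unit (constants as published by Klambauer et al.), and Adamax is the infinity-norm variant of Adam from Kingma & Ba. A minimal pure-Python sketch of both, with the 0.002 step size from the Adam paper (the keras default may differ):

```python
import math

# SELU constants from the self-normalizing networks paper (Klambauer et al.)
SELU_ALPHA = 1.6732632423543772
SELU_SCALE = 1.0507009873554805

def selu(x):
    """scale * x for x > 0, scale * alpha * (exp(x) - 1) otherwise."""
    if x > 0:
        return SELU_SCALE * x
    return SELU_SCALE * SELU_ALPHA * (math.exp(x) - 1.0)

def adamax_update(param, grad, m, u, t,
                  lr=0.002, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adamax step for a scalar parameter.

    m is an exponential moving average of gradients; u is an
    exponentially weighted infinity norm (a running max), which is
    what distinguishes Adamax from plain Adam.
    """
    m = beta1 * m + (1.0 - beta1) * grad          # first-moment estimate
    u = max(beta2 * u, abs(grad))                 # infinity-norm second moment
    step = (lr / (1.0 - beta1 ** t)) * m / (u + eps)
    return param - step, m, u
```

In keras itself (Python side), these correspond to `activation="selu"` on a `Dense` layer and passing a `keras.optimizers.Adamax` instance to `model.compile()`; whether parsnip's `mlp()` forwards those arguments is exactly the question here.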
