
How could I put together ELU or a more advanced activation operation using tensorflow core #294

Open
@Cr33zz

Description


I'm trying to use this library as a computational graph only, inside my own neural network implementation. How should I implement functions like ELU or softmax, given that I have an input Tensor?
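Not an official answer, but both activations can typically be composed from the primitive ops a graph library already exposes (exp, elementwise select/where, max, and sum reductions). A minimal NumPy sketch of the underlying math, which maps one-to-one onto graph ops (function names and the `alpha` parameter here are illustrative, not from this library's API):

```python
import numpy as np

def elu(x, alpha=1.0):
    # ELU: x when x > 0, alpha * (exp(x) - 1) when x <= 0.
    # In a graph library this is a select/where node over two branches.
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

def softmax(x, axis=-1):
    # Subtract the per-row max before exponentiating for numerical
    # stability; it cancels out in the final ratio.
    shifted = x - np.max(x, axis=axis, keepdims=True)
    e = np.exp(shifted)
    return e / np.sum(e, axis=axis, keepdims=True)

x = np.array([-1.0, 0.0, 2.0])
print(elu(x))      # negative inputs are squashed toward -alpha
print(softmax(x))  # non-negative entries summing to 1
```

The max-subtraction trick in `softmax` is worth keeping even in a graph implementation: without it, large logits overflow `exp` while the mathematical result is unchanged.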

Metadata


Assignees

Labels

enhancement (New feature or request)

Type

No type

Projects

No projects

Milestone

No milestone

Relationships

None yet

Development

No branches or pull requests
