
nan gradients with BCHW layout #13

Open
@jj0mst

Description


I recently tried to use the new BCHW functions with my network, since I always use that layout and it simplifies my code.

I noticed that all the gradients of my convolutional layers are now NaN, which in turn fills the weights with NaN after the parameter update.

I'm sure I made all the necessary conversions, and I get no errors such as inconsistencies between tensors.
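To help narrow this down, here is a minimal sketch of how one might localize the first layer whose gradient turns NaN during the backward pass. The framework used in this issue isn't shown above, so the sketch assumes a PyTorch-style API purely for illustration; the model, layer names, and shapes are hypothetical.

```python
# Hypothetical repro/diagnostic sketch, assuming a PyTorch-style API.
import torch
import torch.nn as nn

# Toy stand-in for the real network (shapes are made up).
model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.Conv2d(16, 16, kernel_size=3, padding=1),
)

def make_hook(name):
    def hook(module, grad_input, grad_output):
        # Report any NaN in the gradient flowing into this layer.
        for g in grad_output:
            if g is not None and torch.isnan(g).any():
                print(f"NaN gradient flowing into {name}")
    return hook

for name, module in model.named_modules():
    if isinstance(module, nn.Conv2d):
        module.register_full_backward_hook(make_hook(name))

# BCHW input: batch=2, channels=3, height=8, width=8.
x = torch.randn(2, 3, 8, 8, requires_grad=True)
model(x).sum().backward()
```

Running something like this right after the first parameter update should show whether the NaNs originate in a specific convolution or propagate in from earlier in the graph.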
