Encoder and decoder should not share weights #7

Open
@ufgtb24

Description

I have a question about this code from reading it. Looking at lines 181-182:
https://github.com/cmgreen210/TensorFlowDeepAutoencoder/blob/5298ec437689ba7ecb59229599141549ef6a6a1d/code/ae/autoencoder.py#L181-L182
out = self._activate(last_output, self._w(n), self._b(n, "_out"),
                     transpose_w=True)
I think the second parameter shouldn't be 'self._w(n)': that variable already holds the encoder weights. Here we need a separate set of trainable decoder weights, because the encoder and decoder shouldn't share weights.
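To make the distinction concrete, here is a minimal NumPy sketch (not the repository's TensorFlow code) of the two designs: a "tied" decoder that reuses the encoder matrix transposed, which is what passing 'self._w(n)' with transpose_w=True amounts to, versus an "untied" decoder with its own independent matrix. All names here (encode, decode_tied, decode_untied, W_dec) are illustrative, not from the repo.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
n_in, n_hidden = 6, 3

# Encoder parameters.
W = rng.standard_normal((n_in, n_hidden))   # encoder weights
b_hid = np.zeros(n_hidden)
b_out = np.zeros(n_in)

def encode(x):
    return sigmoid(x @ W + b_hid)

def decode_tied(h):
    # Tied weights: the decoder reuses W transposed, so training the
    # reconstruction also updates the encoder's W (one shared parameter).
    return sigmoid(h @ W.T + b_out)

# Untied alternative: a separate, independently trainable decoder matrix.
W_dec = rng.standard_normal((n_hidden, n_in))

def decode_untied(h):
    return sigmoid(h @ W_dec + b_out)

x = rng.standard_normal((4, n_in))
h = encode(x)
assert decode_tied(h).shape == (4, n_in)
assert decode_untied(h).shape == (4, n_in)
```

Both designs are used in practice: tying halves the parameter count and acts as a regularizer, while untied weights give the decoder more capacity; so reusing 'self._w(n)' may be a deliberate choice rather than a bug.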

Would you please give some clues about this?
