
Remove softmax classification layer and compute gradient #5

@Jamesswiz

Description


Hi,
I want to remove the softmax layer and then compute gradients using guided backpropagation. I used pop() for this:

import keras
from keras.models import load_model
model = load_model('nnet.h5', custom_objects={'keras': keras})
model.layers.pop()  # attempt to drop the final softmax layer

However, I get the following error when I then run:
from guided_backprop import GuidedBackprop
guided_bprop = GuidedBackprop(model)

  Error: The name 'activation_5_1/Softmax:0' refers to a Tensor which does not exist.

I guess pop() does not change the underlying TensorFlow graph. Can you please help me figure out how to handle this?
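
For reference, a minimal sketch of one possible workaround, assuming the softmax is a separate Activation layer (as the tensor name activation_5_1/Softmax:0 suggests): instead of popping in place, build a new Model that ends at the penultimate layer, so the truncation is actually reflected in the graph. The variable names here are placeholders.

import keras
from keras.models import Model, load_model
from guided_backprop import GuidedBackprop

model = load_model('nnet.h5', custom_objects={'keras': keras})
# Re-wire the model so its output is the tensor just before the softmax Activation layer.
truncated = Model(inputs=model.input, outputs=model.layers[-2].output)
guided_bprop = GuidedBackprop(truncated)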

Q: does your code ensure that the gradients are computed with respect to the output node with the maximum activation?
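
In case it helps, a minimal sketch of how a gradient with respect to the maximum-activation output node could be computed with the Keras backend; this is only an illustration of the question above, not necessarily what GuidedBackprop does internally, and x is a placeholder input batch.

import keras.backend as K

# Gradient of the strongest output node w.r.t. the input (illustrative sketch only).
max_node = K.max(model.output, axis=-1)                  # per-sample maximum activation
grads = K.gradients(K.sum(max_node), [model.input])[0]   # d(max node) / d(input)
grad_fn = K.function([model.input], [grads])
gradient = grad_fn([x])[0]  # x: placeholder numpy batch shaped like model.input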
