Hi,
I want to remove the softmax layer and then compute gradients using guided backpropagation. I used pop() for this:
from keras.models import load_model
import keras
model = load_model('nnet.h5', custom_objects={'keras': keras})
model.layers.pop()
However, I am getting the following error while using:
from guided_backprop import GuidedBackprop
guided_bprop = GuidedBackprop(model)
Error: The name 'activation_5_1/Softmax:0' refers to a Tensor which does not exist.
I guess pop() does not change the TF network graph. Can you please help me with how to handle this?
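For what it's worth, the workaround I am considering (not sure it is what you intend; model_no_softmax is my own name) is to rebuild the model so the graph really ends at the pre-softmax layer instead of calling pop():

from keras.models import Model, load_model
import keras

model = load_model('nnet.h5', custom_objects={'keras': keras})
# Define a new model on the same input that ends at the layer before the
# softmax, so the graph no longer references the softmax tensor.
model_no_softmax = Model(inputs=model.input, outputs=model.layers[-2].output)

from guided_backprop import GuidedBackprop
guided_bprop = GuidedBackprop(model_no_softmax)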
Q: does your code ensure that the gradients are with respect to the output node with maximum activation?
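In other words, what I have in mind is something along these lines (a rough sketch using the Keras backend, assuming the rebuilt model_no_softmax from above):

import keras.backend as K

# Gradient of the maximum pre-softmax output node with respect to the input;
# K.max reduces over the class axis, so each sample contributes its own max node.
logits = model_no_softmax.output
grads = K.gradients(K.max(logits, axis=-1), model_no_softmax.input)[0]
grad_fn = K.function([model_no_softmax.input], [grads])
# grad_fn([x_batch]) would then return the input gradients for a batch x_batch.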