Added an AlexNet wrapper using LRP layer definitions. #1
base: master
Conversation
…xpected with older code)
…port in convs); also add __init__.py files to allow for adding interprettensor to /data/ruthfong/tensorflow/:/data/ruthfong/tensorflow/models/slim/nets:/data/ruthfong/tensorflow/models:/data/ruthfong/tensorflow/cleverhans:/data/ruthfong/tensorflow/interprettensor::/users/ruthfong/sample_code/Caffe-ExcitationBP/python and then importing sub-folders as modules.
…sn't work (doberman.png), and helper file with imagenet classes
Hi,
Thanks for your contribution!
I have recently added the ability to initialize the weights for each layer.
So in your convolution.py file, line 35: only the groups=1 case can stay.
self.strides = [1, self.stride_size, self.stride_size, 1]
with tf.variable_scope(self.name):
    self.weights = variables.weights(self.weights_shape)
    self.biases = variables.biases(self.output_depth)
    if self.weights is None:
Also, I am not exactly sure why you are doing this; in any case, the recent update for weight initialization should take care of it.
Before, I wasn't able to set the weights to pre-trained weights after initialization, so I moved that logic to the point where we directly initialize the weights Variable. I'll take a look at the recent update though.
I am not sure that having 'group' kernels helps in any way; the original AlexNet paper only used them to work around GPU memory limits at the time.
BVLC/caffe#778
modules/convolution.py was modified to support a) multiple kernel groups, per AlexNet's original architecture, and b) initialization of weights/biases with passed-in arrays (older code should still work).
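To illustrate what "multiple kernel groups" means here, below is a minimal numpy sketch of a grouped 2D convolution (valid padding, stride 1): the input channels are split into `groups` slices, each slice is convolved with its own sub-bank of kernels, and the per-group outputs are concatenated along the channel axis. This is a hypothetical illustration of the idea, not the PR's actual TensorFlow implementation; the function name and signature are invented for this example.

```python
import numpy as np

def grouped_conv2d(x, kernels, groups=1):
    """Naive grouped 2D convolution (valid padding, stride 1).

    x:       (H, W, C_in) input
    kernels: (kH, kW, C_in // groups, C_out) filter bank
    groups:  number of kernel groups (AlexNet used 2)
    """
    kh, kw, cin_g, cout = kernels.shape
    h, w, cin = x.shape
    assert cin % groups == 0 and cout % groups == 0
    assert cin // groups == cin_g
    out_h, out_w = h - kh + 1, w - kw + 1
    cout_g = cout // groups
    out = np.zeros((out_h, out_w, cout))
    for g in range(groups):
        # Each group sees only its own slice of input channels
        xg = x[:, :, g * cin_g:(g + 1) * cin_g]
        kg = kernels[:, :, :, g * cout_g:(g + 1) * cout_g]
        for i in range(out_h):
            for j in range(out_w):
                patch = xg[i:i + kh, j:j + kw, :]
                out[i, j, g * cout_g:(g + 1) * cout_g] = np.tensordot(
                    patch, kg, axes=([0, 1, 2], [0, 1, 2]))
    return out
```

With `groups=1` this reduces to an ordinary convolution, which is why older code that never passes `groups` keeps working.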
models/alexnet.py allows pre-trained weights to be loaded.
(Weights can be downloaded from http://www.cs.toronto.edu/~guerzhoy/tf_alexnet/bvlc_alexnet.npy)
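For reference, the `bvlc_alexnet.npy` file stores a pickled dict mapping layer names to `[weights, biases]` lists. A minimal loader sketch is shown below; the exact key names (e.g. `'conv1'`) and the `encoding="latin1"` flag are assumptions based on how the tf_weights file is commonly loaded, not something specified in this PR.

```python
import numpy as np

def load_alexnet_weights(path):
    """Load an AlexNet .npy weight file.

    Assumes the file holds a pickled dict of
    layer name -> [weights, biases] (as in bvlc_alexnet.npy).
    Returns a dict of layer name -> (weights, biases) arrays.
    """
    data = np.load(path, allow_pickle=True, encoding="latin1").item()
    return {name: (np.asarray(w), np.asarray(b))
            for name, (w, b) in data.items()}
```

These arrays can then be passed to the layer constructors via the new weight/bias initialization arguments mentioned above.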
Note: This implementation is based on https://github.com/guerzh/tf_weights, which doesn't work perfectly; it classifies some images well but not others, as noted at https://github.com/guerzh/tf_weights/issues/.
examples/alexnet_demo.py
I've included demo code and two example images that demonstrate a correct classification (examples/poodle.png) and an incorrect one (examples/doberman.png).
__init__.py and models/__init__.py
These allow /path/to/interprettensor to be added to PYTHONPATH so that modules and models can be imported as done in examples/alexnet_demo.py. Example:
export PYTHONPATH="/data/ruthfong/tensorflow/interprettensor":$PYTHONPATH
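The same effect can be had from inside Python by extending `sys.path` directly, which is sometimes handier in notebooks or scripts. A sketch (the path is the machine-specific one used in this PR; the commented imports assume a checkout of this repo and are not run here):

```python
import sys

# Equivalent of the PYTHONPATH export above, done at runtime.
REPO_ROOT = "/data/ruthfong/tensorflow/interprettensor"
if REPO_ROOT not in sys.path:
    sys.path.insert(0, REPO_ROOT)

# Sub-packages then resolve as plain imports thanks to the new
# __init__.py files, e.g.:
#   import modules.convolution
#   import models.alexnet
```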