Merged
Changes from 6 commits
4 changes: 3 additions & 1 deletion CHANGELOG.md
@@ -76,6 +76,8 @@ To release a new version, please update the changelog as followed:
- CI Tool:
- Danger CI has been added to enforce the update of the changelog (by @lgarithm and @DEKHTIARJonathan in #563)
- https://github.com/apps/stale/ added to clean stale issues (by @DEKHTIARJonathan in #573)
- API:
[Review comment from Member] Change for Layer

- Add new layer ElementwiseLambdaLayer (by @One-sixth in #576)
[Review comment from Member] Incorrect PR number => #579

Please avoid phrasing entries as "Add <thing>"; the changelog section already tells us something was added. Make it something like this:

### Changed
- Tensorflow CPU & GPU dependencies moved to separated requirement files in order to allow PyUP.io to parse them (by @DEKHTIARJonathan in #573)
@@ -94,7 +96,7 @@
### Dependencies Update

### Contributors
@lgarithm @DEKHTIARJonathan @2wins
@lgarithm @DEKHTIARJonathan @2wins @One-sixth


## [1.8.5] - 2018-05-09
58 changes: 58 additions & 0 deletions tensorlayer/layers/merge.py
@@ -8,6 +8,7 @@
__all__ = [
    'ConcatLayer',
    'ElementwiseLayer',
    'ElementwiseLambdaLayer',
]


@@ -150,3 +151,60 @@ def __init__(
        # # self.all_drop = list_remove_repeat(self.all_drop)

        self.all_layers.append(self.outputs)


class ElementwiseLambdaLayer(Layer):
"""A layer uses a custom function to join multiple layer inputs.
[Review comment from Member] English syntax: "A layer that uses a custom function to combine multiple :class:`Layer` inputs."


    Parameters
    ----------
    layers : list of :class:`Layer`
        The list of layers to combine.
    fn : function
        The function applied to the outputs of the previous layers.
    fn_args : dictionary or None
        The arguments for the function (optional).
    name : str
[Review comment from Member] Missing:

    act : activation function
        The activation function of this layer.
        A unique layer name.

    Examples
    --------
    z = mean + noise * tf.exp(std * 0.5)

    >>> def func(noise, mean, std):
    ...     return mean + noise * tf.exp(std * 0.5)
    >>> x = tf.placeholder(tf.float32, [None, 200])
    >>> noise_tensor = tf.random_normal(tf.stack([tf.shape(x)[0], 200]))
    >>> noise = tl.layers.InputLayer(noise_tensor)
    >>> net = tl.layers.InputLayer(x)
    >>> net = tl.layers.DenseLayer(net, n_units=200, act=tf.nn.relu, name='dense1')
    >>> mean = tl.layers.DenseLayer(net, n_units=200, name='mean')
    >>> std = tl.layers.DenseLayer(net, n_units=200, name='std')
    >>> z = tl.layers.ElementwiseLambdaLayer([noise, mean, std], fn=func, name='z')
    """

    def __init__(
            self,
            layers,
            fn,
            fn_args=None,
[Review comment from Member] please add parameter

    act=None
            act=None,
            name='elementwiselambda_layer',
    ):

        super(ElementwiseLambdaLayer, self).__init__(prev_layer=layers, name=name)
        logging.info("ElementwiseLambdaLayer %s" % self.name)

        if fn_args is None:
            fn_args = {}

        self.inputs = [layer.outputs for layer in layers]

        with tf.variable_scope(name) as vs:
[Review comment from Member] Add a linebreak before the new scope
            self.outputs = fn(*self.inputs, **fn_args)
[Review comment from Member] Suggested:

    self.outputs = fn(*self.inputs, **fn_args)
    if act:
        self.outputs = act(self.outputs)
            if act:
                self.outputs = act(self.outputs)
            variables = tf.get_collection(TF_GRAPHKEYS_VARIABLES, scope=vs.name)

        self.all_layers.append(self.outputs)
        self.all_params.extend(variables)
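The combination this layer performs can be sketched without TensorFlow as a plain-Python helper. This is a minimal illustration of the idea only; `elementwise_lambda` below is a hypothetical stand-in, not TensorLayer API, and operates on lists instead of tensors:

```python
import math

def elementwise_lambda(inputs, fn, act=None):
    # Hypothetical sketch: apply fn across corresponding elements
    # of several parallel input sequences, mirroring how the layer
    # applies fn to the output tensors of its input layers.
    outputs = [fn(*values) for values in zip(*inputs)]
    if act:
        outputs = [act(v) for v in outputs]
    return outputs

# Reparameterization-style combination from the docstring example:
# z = mean + noise * exp(std * 0.5)
noise = [0.1, -0.2]
mean = [1.0, 2.0]
std = [0.0, 0.0]
z = elementwise_lambda([noise, mean, std],
                       lambda n, m, s: m + n * math.exp(s * 0.5))
# z is approximately [1.1, 1.8]
```

Passing `act` here plays the same role as the reviewer's requested `act=None` parameter: the activation is applied after `fn` combines the inputs.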