new api of elu gelu relu logsigmoid, test=develop #2407
.. _cn_api_nn_ELU:

ELU
-------------------------------

.. py:class:: paddle.nn.ELU(alpha=1.0, name=None)

ELU activation layer (ELU Activation Operator).

Applies the following computation to each element of the input Tensor, as described in https://arxiv.org/abs/1511.07289 :

.. math::

    ELU(x) = max(0, x) + min(0, \alpha * (e^{x} - 1))

where :math:`x` is the input Tensor.

Parameters
::::::::::
    - alpha (float, optional) - The alpha value of ELU. Default: 1.0.
    - name (str, optional) - Name of the operation (optional, default: None). For more information, see :ref:`api_guide_Name`.

Shape
::::::::::
    - input: Tensor of any shape.
    - output: Tensor with the same shape as input.

Code Example
::::::::::

.. code-block:: python

    import paddle
    import numpy as np

    paddle.disable_static()

    x = paddle.to_tensor(np.array([[-1, 6], [1, 15.6]]))
    m = paddle.nn.ELU(0.2)
    out = m(x)
    # [[-0.12642411  6.        ]
    #  [ 1.         15.6      ]]
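The formula above can be checked against a plain NumPy sketch (`elu_ref` is an illustrative re-implementation for this doc, not part of the Paddle API):

```python
import numpy as np

def elu_ref(x, alpha=1.0):
    # ELU(x) = max(0, x) + min(0, alpha * (e^x - 1)), applied elementwise
    return np.maximum(0.0, x) + np.minimum(0.0, alpha * (np.exp(x) - 1.0))

x = np.array([[-1.0, 6.0], [1.0, 15.6]])
print(elu_ref(x, alpha=0.2))
# first entry is approximately -0.12642411, matching the example above
```

Positive inputs pass through unchanged; only the negative branch is scaled by `alpha`.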
.. _cn_api_fluid_layers_gelu:

GELU
-------------------------------

.. py:class:: paddle.nn.GELU(approximate=False, name=None)

GELU activation layer (GELU Activation Operator).

For more details, see `Gaussian Error Linear Units <https://arxiv.org/abs/1606.08415>`_ .

When using the approximation:

.. math::

    GELU(x) = 0.5 * x * (1 + tanh(\sqrt{\frac{2}{\pi}} * (x + 0.044715x^{3})))

When not using the approximation:

.. math::

    GELU(x) = 0.5 * x * (1 + erf(\frac{x}{\sqrt{2}}))

where :math:`x` is the input Tensor.

Parameters
::::::::::
    - approximate (bool, optional) - Whether to use the approximate formula. Default: False.
    - name (str, optional) - Name of the operation (optional, default: None). For more information, see :ref:`api_guide_Name`.

Shape
::::::::::
    - input: Tensor of any shape.
    - output: Tensor with the same shape as input.

Code Example
::::::::::

.. code-block:: python

    import paddle
    import numpy as np

    paddle.disable_static()

    data = np.random.randn(2, 3).astype("float32")
    x = paddle.to_tensor(data)

    m = paddle.nn.GELU()
    out = m(x)

    print(data)
    # array([[ 0.87165993, -1.0541513 , -0.37214822],
    #        [ 0.15647964,  0.32496083,  0.33045998]], dtype=float32)
    print(out)
    # array([[ 0.70456535, -0.15380788, -0.13207214],
    #        [ 0.08796856,  0.20387867,  0.2080159 ]], dtype=float32)
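To make the relationship between the two formulas concrete, here is a small pure-Python sketch comparing them (`gelu_exact` and `gelu_approx` are illustrative helpers for this doc, not Paddle APIs):

```python
import math

def gelu_exact(x):
    # exact form: 0.5 * x * (1 + erf(x / sqrt(2)))
    return 0.5 * x * (1.0 + math.erf(x / math.sqrt(2.0)))

def gelu_approx(x):
    # tanh approximation: 0.5 * x * (1 + tanh(sqrt(2/pi) * (x + 0.044715 * x^3)))
    return 0.5 * x * (1.0 + math.tanh(math.sqrt(2.0 / math.pi) * (x + 0.044715 * x ** 3)))

for v in (-1.0, 0.0, 1.0):
    print(v, gelu_exact(v), gelu_approx(v))
```

The two agree to roughly three decimal places for moderate inputs, which is why the cheaper `approximate=True` path is often acceptable in practice.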
.. _cn_api_nn_LogSigmoid:

LogSigmoid
-------------------------------

.. py:class:: paddle.nn.LogSigmoid(name=None)

Collaborator: The parameter list here does not include ``x`` either.
Author: Done

LogSigmoid activation layer. The computation is:

.. math::

    LogSigmoid(x) = \log \frac{1}{1 + e^{-x}}

where :math:`x` is the input Tensor.

Parameters
::::::::::
    - name (str, optional) - Name of the operation (optional, default: None). For more information, see :ref:`api_guide_Name`.

Shape
::::::::::
    - input: Tensor of any shape.
    - output: Tensor with the same shape as input.

Code Example
::::::::::

.. code-block:: python

    import paddle
    import numpy as np

    paddle.disable_static()

    x = paddle.to_tensor(np.array([1.0, 2.0, 3.0, 4.0]))
    m = paddle.nn.LogSigmoid()
    out = m(x)  # [-0.31326169, -0.12692801, -0.04858735, -0.01814993]
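Computing :math:`\log \frac{1}{1 + e^{-x}}` naively can overflow for large negative inputs; a numerically stable NumPy sketch (`log_sigmoid_ref` is an illustrative helper for this doc, not a Paddle API) is:

```python
import numpy as np

def log_sigmoid_ref(x):
    # log(1 / (1 + e^{-x})) = -log(1 + e^{-x}) = -logaddexp(0, -x)
    # logaddexp avoids overflow in e^{-x} for very negative x
    return -np.logaddexp(0.0, -x)

x = np.array([1.0, 2.0, 3.0, 4.0])
print(log_sigmoid_ref(x))
# ≈ [-0.31326169 -0.12692801 -0.04858735 -0.01814993]
```

For very negative inputs the result approaches :math:`x` itself (e.g. around -1000 for x = -1000), which the `logaddexp` form handles without overflow.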
.. _cn_api_nn_ReLU:

ReLU
-------------------------------

.. py:class:: paddle.nn.ReLU(name=None)

Collaborator: The parameter list here does not include ``x`` either.
Author: Done

ReLU activation layer (Rectified Linear Unit). The computation is:

.. math::

    ReLU(x) = max(0, x)

where :math:`x` is the input Tensor.

Parameters
::::::::::
    - name (str, optional) - Name of the operation (optional, default: None). For more information, see :ref:`api_guide_Name`.

Shape
::::::::::
    - input: Tensor of any shape.
    - output: Tensor with the same shape as input.

Code Example
::::::::::

.. code-block:: python

    import paddle
    import numpy as np

    paddle.disable_static()

    x = paddle.to_tensor(np.array([-2, 0, 1]).astype('float32'))
    m = paddle.nn.ReLU()
    out = m(x)  # [0., 0., 1.]
elu
-------------------------------

:doc_source: paddle.fluid.layers.elu

.. py:function:: paddle.nn.functional.elu(x, alpha=1.0, name=None)

elu activation layer (ELU Activation Operator).

Applies the following computation to each element of the input Tensor, as described in https://arxiv.org/abs/1511.07289 :

.. math::

    elu(x) = max(0, x) + min(0, \alpha * (e^{x} - 1))

where :math:`x` is the input Tensor.

Parameters
::::::::::
    - x (Tensor) - The input ``Tensor``, with data type float32 or float64.
    - alpha (float, optional) - The alpha value of ELU. Default: 1.0.
    - name (str, optional) - Name of the operation (optional, default: None). For more information, see :ref:`api_guide_Name`.

Returns
::::::::::
    A ``Tensor`` with the same data type and shape as ``x``.

Code Example
::::::::::

.. code-block:: python

    import paddle
    import paddle.nn.functional as F
    import numpy as np

    paddle.disable_static()

    x = paddle.to_tensor(np.array([[-1, 6], [1, 15.6]]))
    out = F.elu(x, alpha=0.2)
    # [[-0.12642411  6.        ]
    #  [ 1.         15.6      ]]
gelu
-------------------------------

:doc_source: paddle.fluid.layers.gelu

.. py:function:: paddle.nn.functional.gelu(x, approximate=False, name=None)

gelu activation layer (GELU Activation Operator).

Computes the GELU activation elementwise. For more details, see `Gaussian Error Linear Units <https://arxiv.org/abs/1606.08415>`_ .

When using the approximation:

.. math::

    gelu(x) = 0.5 * x * (1 + tanh(\sqrt{\frac{2}{\pi}} * (x + 0.044715x^{3})))

When not using the approximation:

.. math::

    gelu(x) = 0.5 * x * (1 + erf(\frac{x}{\sqrt{2}}))

where :math:`x` is the input Tensor.

Parameters
::::::::::
    - x (Tensor) - The input ``Tensor``, with data type float32 or float64.
    - approximate (bool, optional) - Whether to use the approximate formula. Default: False.
    - name (str, optional) - Name of the operation (optional, default: None). For more information, see :ref:`api_guide_Name`.

Returns
::::::::::
    A ``Tensor`` with the same data type and shape as ``x``.

Code Example
::::::::::

.. code-block:: python

    import paddle
    import paddle.nn.functional as F
    import numpy as np

    paddle.disable_static()

    data = np.random.randn(2, 3).astype("float32")
    x = paddle.to_tensor(data)
    out = F.gelu(x)

    print(data)
    # array([[ 0.87165993, -1.0541513 , -0.37214822],
    #        [ 0.15647964,  0.32496083,  0.33045998]], dtype=float32)
    print(out)
    # array([[ 0.70456535, -0.15380788, -0.13207214],
    #        [ 0.08796856,  0.20387867,  0.2080159 ]], dtype=float32)
Collaborator: The parameter list of ``ELU`` does not include ``x``.
Author: Done