
hardtanh prelu softmax, test=develop #2425


Merged

merged 11 commits on Aug 21, 2020
3 changes: 3 additions & 0 deletions doc/fluid/api/nn.rst
@@ -62,6 +62,7 @@ paddle.nn
nn/grid_sampler.rst
nn/GroupNorm.rst
nn/hardshrink.rst
nn/hardtanh.rst
nn/hard_sigmoid.rst
nn/hard_swish.rst
nn/hash.rst
@@ -104,12 +105,14 @@ paddle.nn
nn/polynomial_decay.rst
nn/Pool2D.rst
nn/pool3d.rst
nn/prelu.rst
nn/prior_box.rst
nn/prroi_pool.rst
nn/psroi_pool.rst
nn/random_crop.rst
nn/rank_loss.rst
nn/ReLU.rst
nn/relu.rst
nn/relu6.rst
nn/resize_bilinear.rst
nn/resize_nearest.rst
3 changes: 3 additions & 0 deletions doc/fluid/api/nn/activation.rst
@@ -8,5 +8,8 @@ activation
activation/ELU.rst
activation/GELU.rst
activation/Hardshrink.rst
activation/Hardtanh.rst
activation/PReLU.rst
activation/ReLU.rst
activation/LogSigmoid.rst
activation/Softmax.rst
7 changes: 7 additions & 0 deletions doc/fluid/api/nn/activation/ELU.rst
@@ -0,0 +1,7 @@
.. _api_nn_activation_ELU:

ELU
-------------------------------

.. autoclass:: paddle.nn.ELU
:noindex:
7 changes: 7 additions & 0 deletions doc/fluid/api/nn/activation/GELU.rst
@@ -0,0 +1,7 @@
.. _api_nn_activation_GELU:

GELU
-------------------------------

.. autoclass:: paddle.nn.GELU
:noindex:
7 changes: 7 additions & 0 deletions doc/fluid/api/nn/activation/Hardtanh.rst
@@ -0,0 +1,7 @@
.. _api_nn_activation_Hardtanh:

Hardtanh
-------------------------------

.. autoclass:: paddle.nn.Hardtanh
:noindex:
7 changes: 7 additions & 0 deletions doc/fluid/api/nn/activation/LogSigmoid.rst
@@ -0,0 +1,7 @@
.. _api_nn_activation_LogSigmoid:

LogSigmoid
-------------------------------

.. autoclass:: paddle.nn.LogSigmoid
:noindex:
7 changes: 7 additions & 0 deletions doc/fluid/api/nn/activation/PReLU.rst
@@ -0,0 +1,7 @@
.. _api_nn_activation_PReLU:

PReLU
-------------------------------

.. autoclass:: paddle.nn.PReLU
:noindex:
7 changes: 7 additions & 0 deletions doc/fluid/api/nn/activation/ReLU.rst
@@ -0,0 +1,7 @@
.. _api_nn_activation_ReLU:

ReLU
-------------------------------

.. autoclass:: paddle.nn.ReLU
:noindex:
7 changes: 7 additions & 0 deletions doc/fluid/api/nn/activation/Softmax.rst
@@ -0,0 +1,7 @@
.. _api_nn_activation_Softmax:

Softmax
-------------------------------

.. autoclass:: paddle.nn.Softmax
:noindex:
4 changes: 3 additions & 1 deletion doc/fluid/api/nn/elu.rst
@@ -2,6 +2,8 @@

elu
-------------------------------
:doc_source: paddle.fluid.layers.elu

.. autofunction:: paddle.nn.functional.elu
:noindex:


4 changes: 3 additions & 1 deletion doc/fluid/api/nn/gelu.rst
@@ -2,6 +2,8 @@

gelu
-------------------------------
:doc_source: paddle.fluid.layers.gelu

.. autofunction:: paddle.nn.functional.gelu
:noindex:


7 changes: 7 additions & 0 deletions doc/fluid/api/nn/hardtanh.rst
@@ -0,0 +1,7 @@
.. _api_nn_hardtanh:

hardtanh
-------------------------------

.. autofunction:: paddle.nn.functional.hardtanh
:noindex:
4 changes: 3 additions & 1 deletion doc/fluid/api/nn/logsigmoid.rst
@@ -2,6 +2,8 @@

logsigmoid
-------------------------------
:doc_source: paddle.fluid.layers.logsigmoid

.. autofunction:: paddle.nn.functional.logsigmoid
:noindex:


7 changes: 7 additions & 0 deletions doc/fluid/api/nn/prelu.rst
@@ -0,0 +1,7 @@
.. _api_nn_prelu:

prelu
-------------------------------

.. autofunction:: paddle.nn.functional.prelu
:noindex:
7 changes: 7 additions & 0 deletions doc/fluid/api/nn/relu.rst
@@ -0,0 +1,7 @@
.. _api_nn_relu:

relu
-------------------------------

.. autofunction:: paddle.nn.functional.relu
:noindex:
5 changes: 1 addition & 4 deletions doc/fluid/api/nn/softmax.rst
@@ -1,10 +1,7 @@
.. THIS FILE IS GENERATED BY `gen_doc.{py|sh}`
!DO NOT EDIT THIS FILE MANUALLY!

.. _api_nn_softmax:

softmax
-------
-------------------------------

.. autofunction:: paddle.nn.functional.softmax
:noindex:
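
For context, a minimal usage sketch of the function this stub documents; the default axis is assumed to be the last dimension:

.. code-block:: python

import paddle
import numpy as np

paddle.disable_static()

x = paddle.to_tensor(np.array([[1.0, 2.0, 3.0]], 'float32'))
# Softmax over the last axis (assumed default); each row sums to 1.
out = paddle.nn.functional.softmax(x)
# [[0.09003057, 0.24472847, 0.66524096]]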
2 changes: 2 additions & 0 deletions doc/fluid/api_cn/nn_cn.rst
@@ -74,6 +74,7 @@ paddle.nn
nn_cn/grid_sampler_cn.rst
nn_cn/GroupNorm_cn.rst
nn_cn/hardshrink_cn.rst
nn_cn/hardtanh_cn.rst
nn_cn/hard_sigmoid_cn.rst
nn_cn/hard_swish_cn.rst
nn_cn/hash_cn.rst
@@ -116,6 +117,7 @@ paddle.nn
nn_cn/pool2d_cn.rst
nn_cn/Pool2D_cn.rst
nn_cn/pool3d_cn.rst
nn_cn/prelu_cn.rst
nn_cn/prior_box_cn.rst
nn_cn/prroi_pool_cn.rst
nn_cn/psroi_pool_cn.rst
3 changes: 3 additions & 0 deletions doc/fluid/api_cn/nn_cn/activation_cn.rst
@@ -11,8 +11,11 @@ activation
activation_cn/ELU_cn.rst
activation_cn/GELU_cn.rst
activation_cn/Hardshrink_cn.rst
activation_cn/Hardtanh_cn.rst
activation_cn/PRelu_cn.rst
activation_cn/ReLU_cn.rst
activation_cn/LeakyReLU_cn.rst
activation_cn/Softmax_cn.rst
activation_cn/LogSoftmax_cn.rst
activation_cn/Sigmoid_cn.rst
activation_cn/LogSigmoid_cn.rst
2 changes: 1 addition & 1 deletion doc/fluid/api_cn/nn_cn/activation_cn/ELU_cn.rst
@@ -6,7 +6,7 @@ ELU

ELU activation layer (ELU Activation Operator)

Applies the computation below to each element of the input Tensor, as described in `Exponential Linear Units <https://arxiv.org/abs/1511.07289>`.
Applies the computation below to each element of the input Tensor, as described in `Exponential Linear Units <https://arxiv.org/abs/1511.07289>`_.

.. math::

2 changes: 1 addition & 1 deletion doc/fluid/api_cn/nn_cn/activation_cn/GELU_cn.rst
@@ -6,7 +6,7 @@ GELU

GELU activation layer (GELU Activation Operator)

For more details, see `Gaussian Error Linear Units <https://arxiv.org/abs/1606.08415>`.
Computes the GELU activation function element-wise. For more details, see `Gaussian Error Linear Units <https://arxiv.org/abs/1606.08415>`_.

If approximate calculation is used:

45 changes: 45 additions & 0 deletions doc/fluid/api_cn/nn_cn/activation_cn/Hardtanh_cn.rst
@@ -0,0 +1,45 @@
.. _cn_api_nn_Hardtanh:

Hardtanh
-------------------------------
.. py:class:: paddle.nn.Hardtanh(min=-1.0, max=1.0, name=None)

Hardtanh activation layer (Hardtanh Activation Operator). Computed as follows:

.. math::

Hardtanh(x)=
\left\{
\begin{aligned}
&max, & & if \ x > max \\
&min, & & if \ x < min \\
&x, & & otherwise
\end{aligned}
\right.

where :math:`x` is the input Tensor.

Parameters
::::::::::
- min (float, optional) - The min value in the Hardtanh formula. Default: -1.0.
- max (float, optional) - The max value in the Hardtanh formula. Default: 1.0.
- name (str, optional) - Name of the operation (optional; default: None). For more information, see :ref:`api_guide_Name`.

Shape:
::::::::::
- input: Tensor of any shape.
- output: Tensor with the same shape as input.

Code example
:::::::::

.. code-block:: python

import paddle
import numpy as np

paddle.disable_static()

x = paddle.to_tensor(np.array([-1.5, 0.3, 2.5]))
m = paddle.nn.Hardtanh()
out = m(x)  # [-1., 0.3, 1.]
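
The functional counterpart added by this PR behaves the same way; a minimal sketch, assuming `paddle.nn.functional.hardtanh` mirrors the layer's ``min``/``max`` keyword arguments:

.. code-block:: python

import paddle
import numpy as np

paddle.disable_static()

x = paddle.to_tensor(np.array([-1.5, 0.3, 2.5]))
# Functional form; min/max kwargs assumed to mirror paddle.nn.Hardtanh.
out = paddle.nn.functional.hardtanh(x, min=-1.0, max=1.0)  # [-1., 0.3, 1.]

# NumPy reference: Hardtanh is a clip of x to [min, max].
ref = np.clip(np.array([-1.5, 0.3, 2.5]), -1.0, 1.0)  # [-1., 0.3, 1.]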
6 changes: 3 additions & 3 deletions doc/fluid/api_cn/nn_cn/activation_cn/LogSigmoid_cn.rst
@@ -4,11 +4,11 @@ LogSigmoid
-------------------------------
.. py:class:: paddle.nn.LogSigmoid(name=None)

Logsigmoid activation layer. Computed as follows:
LogSigmoid activation layer. Computed as follows:

.. math::

Logsigmoid(x) = \log \frac{1}{1 + e^{-x}}
LogSigmoid(x) = \log \frac{1}{1 + e^{-x}}

where :math:`x` is the input Tensor.

@@ -33,4 +33,4 @@ LogSigmoid activation layer. Computed as follows:

x = paddle.to_tensor(np.array([1.0, 2.0, 3.0, 4.0]))
m = paddle.nn.LogSigmoid()
out = m(x) # [0.7310586, 0.880797, 0.95257413, 0.98201376]
out = m(x) # [-0.313262 -0.126928 -0.0485874 -0.0181499]
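
The corrected output comment above can be verified independently; a quick NumPy check of :math:`LogSigmoid(x) = \log \frac{1}{1 + e^{-x}}`:

.. code-block:: python

import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0])
# log(1 / (1 + exp(-x))) simplifies to -log(1 + exp(-x)).
out = -np.log1p(np.exp(-x))
# [-0.31326169 -0.12692801 -0.04858735 -0.01814993]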
53 changes: 53 additions & 0 deletions doc/fluid/api_cn/nn_cn/activation_cn/PRelu_cn.rst
@@ -0,0 +1,53 @@
.. _cn_api_nn_PRelu:

PRelu
-------------------------------
.. py:class:: paddle.nn.PReLU(num_parameters=1, init=0.25, weight_attr=None, name=None)

PReLU activation layer (PReLU Activation Operator). Computed as follows:

.. math::

PReLU(x) = max(0, x) + weight * min(0, x)

where :math:`x` is the input Tensor.

Parameters
::::::::::
- num_parameters (int, optional) - Number of trainable ``weight`` values. Two settings are supported: 1, where all elements of the input share a single ``weight``; or the number of input channels, where elements in the same channel share one ``weight``. Default: 1.
- init (float, optional) - Initial value of ``weight``. Default: 0.25.
- weight_attr (ParamAttr, optional) - Attribute object that specifies the weight parameter's properties. Default: None, which uses the default weight parameter attribute. For usage details, see :ref:`cn_api_fluid_ParamAttr`.
- name (str, optional) - Name of the operation (optional; default: None). For more information, see :ref:`api_guide_Name`.

Shape:
::::::::::
- input: Tensor of any shape.
- output: Tensor with the same shape as input.

Code example
:::::::::

.. code-block:: python

import paddle
import numpy as np

paddle.disable_static()

data = np.array([[[[-2.0, 3.0, -4.0, 5.0],
[ 3.0, -4.0, 5.0, -6.0],
[-7.0, -8.0, 8.0, 9.0]],
[[ 1.0, -2.0, -3.0, 4.0],
[-5.0, 6.0, 7.0, -8.0],
[ 6.0, 7.0, 8.0, 9.0]]]], 'float32')
x = paddle.to_tensor(data)
m = paddle.nn.PReLU(1, 0.25)
out = m(x)
# [[[[-0.5 , 3. , -1. , 5. ],
# [ 3. , -1. , 5. , -1.5 ],
# [-1.75, -2. , 8. , 9. ]],
# [[ 1. , -0.5 , -0.75, 4. ],
# [-1.25, 6. , 7. , -2. ],
# [ 6. , 7. , 8. , 9. ]]]]
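
A functional form is documented in this PR as well (doc/fluid/api/nn/prelu.rst); a minimal sketch, assuming `paddle.nn.functional.prelu(x, weight)` takes the negative-slope coefficient as a Tensor:

.. code-block:: python

import paddle
import numpy as np

paddle.disable_static()

x = paddle.to_tensor(np.array([-2.0, 3.0, -4.0, 5.0], 'float32'))
# One shared slope of 0.25, passed as a 1-element Tensor (assumed signature).
w = paddle.to_tensor(np.array([0.25], 'float32'))
out = paddle.nn.functional.prelu(x, w)  # [-0.5, 3., -1., 5.]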