
Commit 1311089

hardtanh prelu softmax, test=develop (#2425)

1 parent 3354d55 commit 1311089

29 files changed: +418, -22 lines

doc/fluid/api/nn.rst (+3)

@@ -62,6 +62,7 @@ paddle.nn
     nn/grid_sampler.rst
     nn/GroupNorm.rst
     nn/hardshrink.rst
+    nn/hardtanh.rst
     nn/hard_sigmoid.rst
     nn/hard_swish.rst
     nn/hash.rst
@@ -104,12 +105,14 @@ paddle.nn
     nn/polynomial_decay.rst
     nn/Pool2D.rst
     nn/pool3d.rst
+    nn/prelu.rst
     nn/prior_box.rst
     nn/prroi_pool.rst
     nn/psroi_pool.rst
     nn/random_crop.rst
     nn/rank_loss.rst
     nn/ReLU.rst
+    nn/relu.rst
     nn/relu6.rst
     nn/resize_bilinear.rst
     nn/resize_nearest.rst

doc/fluid/api/nn/activation.rst (+3)

@@ -8,5 +8,8 @@ activation
     activation/ELU.rst
     activation/GELU.rst
     activation/Hardshrink.rst
+    activation/Hardtanh.rst
+    activation/PReLU.rst
     activation/ReLU.rst
     activation/LogSigmoid.rst
+    activation/Softmax.rst

doc/fluid/api/nn/activation/ELU.rst (+7, new file)

@@ -0,0 +1,7 @@
+.. _api_nn_activation_ELU:
+
+ELU
+-------------------------------
+
+.. autoclass:: paddle.nn.ELU
+    :noindex:

doc/fluid/api/nn/activation/GELU.rst (+7, new file)

@@ -0,0 +1,7 @@
+.. _api_nn_activation_GELU:
+
+GELU
+-------------------------------
+
+.. autoclass:: paddle.nn.GELU
+    :noindex:

doc/fluid/api/nn/activation/Hardtanh.rst (+7, new file)

@@ -0,0 +1,7 @@
+.. _api_nn_activation_Hardtanh:
+
+Hardtanh
+-------------------------------
+
+.. autoclass:: paddle.nn.Hardtanh
+    :noindex:

doc/fluid/api/nn/activation/LogSigmoid.rst (+7, new file)

@@ -0,0 +1,7 @@
+.. _api_nn_activation_LogSigmoid:
+
+LogSigmoid
+-------------------------------
+
+.. autoclass:: paddle.nn.LogSigmoid
+    :noindex:

doc/fluid/api/nn/activation/PReLU.rst (+7, new file)

@@ -0,0 +1,7 @@
+.. _api_nn_activation_PReLU:
+
+PReLU
+-------------------------------
+
+.. autoclass:: paddle.nn.PReLU
+    :noindex:

doc/fluid/api/nn/activation/ReLU.rst (+7, new file)

@@ -0,0 +1,7 @@
+.. _api_nn_activation_ReLU:
+
+ReLU
+-------------------------------
+
+.. autoclass:: paddle.nn.ReLU
+    :noindex:

doc/fluid/api/nn/activation/Softmax.rst (+7, new file)

@@ -0,0 +1,7 @@
+.. _api_nn_activation_Softmax:
+
+Softmax
+-------------------------------
+
+.. autoclass:: paddle.nn.Softmax
+    :noindex:

doc/fluid/api/nn/elu.rst (+3, -1)

@@ -2,6 +2,8 @@
 
 elu
 -------------------------------
-:doc_source: paddle.fluid.layers.elu
+
+.. autofunction:: paddle.nn.functional.elu
+    :noindex:
 

doc/fluid/api/nn/gelu.rst (+3, -1)

@@ -2,6 +2,8 @@
 
 gelu
 -------------------------------
-:doc_source: paddle.fluid.layers.gelu
+
+.. autofunction:: paddle.nn.functional.gelu
+    :noindex:
 

doc/fluid/api/nn/hardtanh.rst (+7, new file)

@@ -0,0 +1,7 @@
+.. _api_nn_hardtanh:
+
+hardtanh
+-------------------------------
+
+.. autofunction:: paddle.nn.functional.hardtanh
+    :noindex:

doc/fluid/api/nn/logsigmoid.rst (+3, -1)

@@ -2,6 +2,8 @@
 
 logsigmoid
 -------------------------------
-:doc_source: paddle.fluid.layers.logsigmoid
+
+.. autofunction:: paddle.nn.functional.logsigmoid
+    :noindex:
 

doc/fluid/api/nn/prelu.rst (+7, new file)

@@ -0,0 +1,7 @@
+.. _api_nn_prelu:
+
+prelu
+-------------------------------
+
+.. autofunction:: paddle.nn.functional.prelu
+    :noindex:

doc/fluid/api/nn/relu.rst (+7, new file)

@@ -0,0 +1,7 @@
+.. _api_nn_relu:
+
+relu
+-------------------------------
+
+.. autofunction:: paddle.nn.functional.relu
+    :noindex:

doc/fluid/api/nn/softmax.rst (+1, -4)

@@ -1,10 +1,7 @@
-.. THIS FILE IS GENERATED BY `gen_doc.{py|sh}`
-   !DO NOT EDIT THIS FILE MANUALLY!
-
 .. _api_nn_softmax:
 
 softmax
--------
+-------------------------------
 
 .. autofunction:: paddle.nn.functional.softmax
     :noindex:

doc/fluid/api_cn/nn_cn.rst (+2)

@@ -75,6 +75,7 @@ paddle.nn
     nn_cn/grid_sampler_cn.rst
     nn_cn/GroupNorm_cn.rst
     nn_cn/hardshrink_cn.rst
+    nn_cn/hardtanh_cn.rst
     nn_cn/hard_sigmoid_cn.rst
     nn_cn/hard_swish_cn.rst
     nn_cn/hash_cn.rst
@@ -117,6 +118,7 @@ paddle.nn
     nn_cn/pool2d_cn.rst
     nn_cn/Pool2D_cn.rst
     nn_cn/pool3d_cn.rst
+    nn_cn/prelu_cn.rst
     nn_cn/prior_box_cn.rst
     nn_cn/prroi_pool_cn.rst
     nn_cn/psroi_pool_cn.rst

doc/fluid/api_cn/nn_cn/activation_cn.rst (+3)

@@ -11,8 +11,11 @@ activation
     activation_cn/ELU_cn.rst
     activation_cn/GELU_cn.rst
     activation_cn/Hardshrink_cn.rst
+    activation_cn/Hardtanh_cn.rst
+    activation_cn/PRelu_cn.rst
     activation_cn/ReLU_cn.rst
     activation_cn/LeakyReLU_cn.rst
+    activation_cn/Softmax_cn.rst
     activation_cn/LogSoftmax_cn.rst
     activation_cn/Sigmoid_cn.rst
     activation_cn/LogSigmoid_cn.rst

doc/fluid/api_cn/nn_cn/activation_cn/ELU_cn.rst (+1, -1)

@@ -6,7 +6,7 @@ ELU
 
 ELU activation layer (ELU Activation Operator)
 
-Applies the following computation to each element of the input Tensor, as described in `Exponential Linear Units <https://arxiv.org/abs/1511.07289>`.
+Applies the following computation to each element of the input Tensor, as described in `Exponential Linear Units <https://arxiv.org/abs/1511.07289>`_.
 
 .. math::
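
For context, the ELU computation referenced in this file (the formula itself falls outside this hunk) is the standard x for x > 0 and alpha * (exp(x) - 1) otherwise. A minimal numpy sketch, assuming the default alpha=1.0; this is an illustration, not part of the commit:

    import numpy as np

    def elu(x, alpha=1.0):
        # x unchanged where positive; alpha * (e^x - 1) where negative
        return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

    print(elu(np.array([-1.0, 0.5])))  # [-0.63212056  0.5       ]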

doc/fluid/api_cn/nn_cn/activation_cn/GELU_cn.rst (+1, -1)

@@ -6,7 +6,7 @@ GELU
 
 GELU activation layer (GELU Activation Operator)
 
-For more details, see `Gaussian Error Linear Units <https://arxiv.org/abs/1606.08415>`.
+Computes the GELU activation element-wise. For more details, see `Gaussian Error Linear Units <https://arxiv.org/abs/1606.08415>`_.
 
 If the approximate computation is used:
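
The approximate computation mentioned in this hunk is the usual tanh approximation of GELU, 0.5 * x * (1 + tanh(sqrt(2/pi) * (x + 0.044715 * x^3))). A minimal numpy sketch (an illustration, not taken from the commit):

    import numpy as np

    def gelu_approx(x):
        # Tanh approximation of GELU (Hendrycks & Gimpel, 2016)
        return 0.5 * x * (1.0 + np.tanh(np.sqrt(2.0 / np.pi) * (x + 0.044715 * x**3)))

    print(gelu_approx(np.array([-1.0, 0.0, 1.0])))  # approx. [-0.1587  0.      0.8413]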

doc/fluid/api_cn/nn_cn/activation_cn/Hardtanh_cn.rst (+45, new file)

@@ -0,0 +1,45 @@
+.. _cn_api_nn_Hardtanh:
+
+Hardtanh
+-------------------------------
+.. py:class:: paddle.nn.Hardtanh(min=-1.0, max=1.0, name=None)
+
+Hardtanh activation layer (Hardtanh Activation Operator). The formula is:
+
+.. math::
+
+    Hardtanh(x) =
+        \left\{
+        \begin{aligned}
+        &max, & & if \ x > max \\
+        &min, & & if \ x < min \\
+        &x, & & otherwise
+        \end{aligned}
+        \right.
+
+where :math:`x` is the input Tensor.
+
+Parameters
+::::::::::
+    - min (float, optional) - The min value in the Hardtanh formula. Default: -1.0.
+    - max (float, optional) - The max value in the Hardtanh formula. Default: 1.0.
+    - name (str, optional) - Name of the operation (optional; default is None). See :ref:`api_guide_Name` for more information.
+
+Shape:
+::::::::::
+    - input: Tensor of any shape.
+    - output: Tensor with the same shape as the input.
+
+Code Example
+:::::::::
+
+.. code-block:: python
+
+    import paddle
+    import numpy as np
+
+    paddle.disable_static()
+
+    x = paddle.to_tensor(np.array([-1.5, 0.3, 2.5]))
+    m = paddle.nn.Hardtanh()
+    out = m(x)  # [-1., 0.3, 1.]
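
The Hardtanh formula above is an element-wise clip to [min, max], so the documented output can be checked with plain numpy (an illustrative sketch, not part of the commit):

    import numpy as np

    def hardtanh(x, t_min=-1.0, t_max=1.0):
        # min where x < min, max where x > max, x otherwise
        return np.clip(x, t_min, t_max)

    print(hardtanh(np.array([-1.5, 0.3, 2.5])))  # [-1.   0.3  1. ]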

doc/fluid/api_cn/nn_cn/activation_cn/LogSigmoid_cn.rst (+3, -3)

@@ -4,11 +4,11 @@ LogSigmoid
 -------------------------------
 .. py:class:: paddle.nn.LogSigmoid(name=None)
 
-Logsigmoid activation layer. The formula is:
+LogSigmoid activation layer. The formula is:
 
 .. math::
 
-    Logsigmoid(x) = \log \frac{1}{1 + e^{-x}}
+    LogSigmoid(x) = \log \frac{1}{1 + e^{-x}}
 
 where :math:`x` is the input Tensor.

@@ -33,4 +33,4 @@ LogSigmoid activation layer. The formula is:
 
     x = paddle.to_tensor(np.array([1.0, 2.0, 3.0, 4.0]))
     m = paddle.nn.LogSigmoid()
-    out = m(x)  # [0.7310586, 0.880797, 0.95257413, 0.98201376]
+    out = m(x)  # [-0.313262 -0.126928 -0.0485874 -0.0181499]
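
The corrected example output above (the old values were sigmoid(x), not log(sigmoid(x))) can be verified with plain numpy (an illustrative sketch, not part of the commit):

    import numpy as np

    # log(1 / (1 + e^-x)) = -log(1 + e^-x), computed stably with log1p
    x = np.array([1.0, 2.0, 3.0, 4.0])
    print(-np.log1p(np.exp(-x)))  # [-0.31326169 -0.12692801 -0.04858735 -0.01814993]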

doc/fluid/api_cn/nn_cn/activation_cn/PRelu_cn.rst (+53, new file)

@@ -0,0 +1,53 @@
+.. _cn_api_nn_PRelu:
+
+PRelu
+-------------------------------
+.. py:class:: paddle.nn.PRelu(num_parameters=1, init=0.25, weight_attr=None, name=None)
+
+PRelu activation layer (PRelu Activation Operator). The formula is:
+
+.. math::
+
+    PReLU(x) = max(0, x) + weight * min(0, x)
+
+where :math:`x` is the input Tensor.
+
+Parameters
+::::::::::
+    - num_parameters (int, optional) - The number of trainable `weight` values. Two settings are supported: 1 - all elements of the input share a single `weight` value; the number of input channels - all elements within the same channel share a single `weight` value. Default: 1.
+    - init (float, optional) - Initial value of `weight`. Default: 0.25.
+    - weight_attr (ParamAttr, optional) - The parameter attribute object for the weight. Default is None, meaning the default weight parameter attribute is used. See :ref:`cn_api_fluid_ParamAttr` for details.
+    - name (str, optional) - Name of the operation (optional; default is None). See :ref:`api_guide_Name` for more information.
+
+Shape:
+::::::::::
+    - input: Tensor of any shape.
+    - output: Tensor with the same shape as the input.
+
+Code Example
+:::::::::
+
+.. code-block:: python
+
+    import paddle
+    import numpy as np
+
+    paddle.disable_static()
+
+    data = np.array([[[[-2.0,  3.0, -4.0,  5.0],
+                       [ 3.0, -4.0,  5.0, -6.0],
+                       [-7.0, -8.0,  8.0,  9.0]],
+                      [[ 1.0, -2.0, -3.0,  4.0],
+                       [-5.0,  6.0,  7.0, -8.0],
+                       [ 6.0,  7.0,  8.0,  9.0]]]], 'float32')
+    x = paddle.to_tensor(data)
+    m = paddle.nn.PReLU(1, 0.25)
+    out = m(x)
+    # [[[[-0.5 ,  3.  , -1.  ,  5.  ],
+    #    [ 3.  , -1.  ,  5.  , -1.5 ],
+    #    [-1.75, -2.  ,  8.  ,  9.  ]],
+    #   [[ 1.  , -0.5 , -0.75,  4.  ],
+    #    [-1.25,  6.  ,  7.  , -2.  ],
+    #    [ 6.  ,  7.  ,  8.  ,  9.  ]]]]
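
The PReLU formula above can likewise be checked element-wise with plain numpy; with num_parameters=1 a single scalar weight is broadcast over all elements (an illustrative sketch, not part of the commit):

    import numpy as np

    def prelu(x, weight=0.25):
        # PReLU(x) = max(0, x) + weight * min(0, x)
        return np.maximum(0.0, x) + weight * np.minimum(0.0, x)

    print(prelu(np.array([-2.0, 3.0, -4.0, 5.0])))  # [-0.5  3.  -1.   5. ]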
