![](https://zwlw-static.oss-cn-shanghai.aliyuncs.com/static/images/covers-random/194.png)
This article presents visualizations of all twenty-odd activation functions supported by PyTorch.
Listed here are all of the twenty-odd activation functions supported by PyTorch. Each activation function and its gradient are plotted, so you can get a more intuitive feel for how each one behaves.
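The gradient curves in the plots below can be sanity-checked numerically. Here is a minimal pure-Python sketch (a stand-in for the `torch.autograd` machinery presumably used to generate the figures; the helper names are illustrative), using Sigmoid as the example:

```python
import math

def sigmoid(x):
    """Sigmoid(x) = 1 / (1 + exp(-x))."""
    return 1.0 / (1.0 + math.exp(-x))

def numerical_grad(f, x, h=1e-6):
    """Central-difference approximation of f'(x)."""
    return (f(x + h) - f(x - h)) / (2 * h)

# The analytic gradient of sigmoid is sigma(x) * (1 - sigma(x));
# the gradient curve in each figure below is exactly this kind of plot.
for x in [-2.0, 0.0, 2.0]:
    analytic = sigmoid(x) * (1 - sigmoid(x))
    assert abs(numerical_grad(sigmoid, x) - analytic) < 1e-6
```

The same finite-difference check applies to every activation in this list, which is handy wherever the closed-form gradient is not obvious.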
Sigmoid
$$
\text{Sigmoid}(x) = \sigma(x) = \frac{1}{1 + e^{-x}}
$$
![Sigmoid](https://zwlw-static.oss-cn-shanghai.aliyuncs.com/media/articles/user3_images/6b85d69245fd11a1/2022/2/sigmoid.png?OSSAccessKeyId=LTAI5t5cqAaT93BT832yXzMS&Expires=1960562140&Signature=a4/9IIRJWoLxvKy708AuD/r6i2o%3D)
LogSigmoid
$$
\text{LogSigmoid}(x) = \log\left(\frac{ 1 }{ 1 + e^{-x}}\right)
$$
![LogSigmoid](https://zwlw-static.oss-cn-shanghai.aliyuncs.com/media/articles/user3_images/2a73a3706cd75323/2022/2/logsigmoid.png?OSSAccessKeyId=LTAI5t5cqAaT93BT832yXzMS&Expires=1960562140&Signature=MfzLZPO/lNjT4ErAoqgrlicmBSw%3D)
SoftSign
$$
\text{SoftSign}(x) = \frac{x}{ 1 + |x|}
$$
![SoftSign](https://zwlw-static.oss-cn-shanghai.aliyuncs.com/media/articles/user3_images/d715e283675ea86/2022/2/softsign.png?OSSAccessKeyId=LTAI5t5cqAaT93BT832yXzMS&Expires=1960562140&Signature=qxzxsQRrJ9%2BUUB6ETcoOCrmvDwk%3D)
HardSigmoid
$$
\text{Hardsigmoid}(x) = \begin{cases}
0 & \text{if~} x \le -3, \\
1 & \text{if~} x \ge +3, \\
x / 6 + 1 / 2 & \text{otherwise}
\end{cases}
$$
![HardSigmoid](https://zwlw-static.oss-cn-shanghai.aliyuncs.com/media/articles/user3_images/96fdbb58793292c/2022/2/hard_sigmoid.png?OSSAccessKeyId=LTAI5t5cqAaT93BT832yXzMS&Expires=1960562140&Signature=Uw9oZcYUPPGx7ClHms3b3OL9V9E%3D)
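A quick pure-Python check of this piecewise definition (the function name is illustrative; PyTorch's version is `nn.Hardsigmoid`):

```python
def hardsigmoid(x):
    """Piecewise-linear approximation of sigmoid (formula above)."""
    if x <= -3:
        return 0.0
    if x >= 3:
        return 1.0
    return x / 6 + 0.5

# The three pieces join continuously at the breakpoints:
assert hardsigmoid(-3) == 0.0   # (-3)/6 + 1/2 == 0
assert hardsigmoid(3) == 1.0    # 3/6 + 1/2 == 1
assert hardsigmoid(0) == 0.5
```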
Tanh
$$
\text{Tanh}(x) = \tanh(x) = \frac{e^{x} - e^{-x}} {e^{x} + e^{-x}}
$$
![Tanh](https://zwlw-static.oss-cn-shanghai.aliyuncs.com/media/articles/user3_images/3c666d687ae570cc/2022/2/tanh.png?OSSAccessKeyId=LTAI5t5cqAaT93BT832yXzMS&Expires=1960562140&Signature=czcWtsjSkxPGiOcej3gVRakX2MA%3D)
HardTanh
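Assuming PyTorch's defaults `min_val = -1` and `max_val = 1`, HardTanh clips its input to $[-1, 1]$:

$$
\text{HardTanh}(x) = \begin{cases}
1 & \text{if~} x \gt 1 \\
-1 & \text{if~} x \lt -1 \\
x & \text{otherwise}
\end{cases}
$$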
![HardTanh](https://zwlw-static.oss-cn-shanghai.aliyuncs.com/media/articles/user3_images/9d8a6152ef64bedc/2022/2/hard_tanh.png?OSSAccessKeyId=LTAI5t5cqAaT93BT832yXzMS&Expires=1960562140&Signature=183sC74cuM/iH3RRE1BjPmbJMQ8%3D)
ReLU
$$
\text{ReLU}(x) = \max(0, x)
$$
![ReLU](https://zwlw-static.oss-cn-shanghai.aliyuncs.com/media/articles/user3_images/67b4646165c83f8/2022/2/relu.png?OSSAccessKeyId=LTAI5t5cqAaT93BT832yXzMS&Expires=1960562140&Signature=eKnG8sP26ACub0ik4zkoFRr3dxE%3D)
ReLU6
$$
\text{ReLU6}(x) = \min(\max(0,x), 6)
$$
![ReLU6](https://zwlw-static.oss-cn-shanghai.aliyuncs.com/media/articles/user3_images/70c33340e174183/2022/2/relu6.png?OSSAccessKeyId=LTAI5t5cqAaT93BT832yXzMS&Expires=1960562140&Signature=e1WUiXg9MGW4u5Yb1U5MhNNwu8o%3D)
PReLU
$$
f(x)= \begin{cases} x, &x \gt 0\\ ax, & x \le 0 \end{cases}
$$
The parameter $a$ of PReLU is learned during training.
![PReLU](https://zwlw-static.oss-cn-shanghai.aliyuncs.com/media/articles/user3_images/b1bbef0e8faa4b5/2022/2/prelu.png?OSSAccessKeyId=LTAI5t5cqAaT93BT832yXzMS&Expires=1960562140&Signature=cTIh2FWxawbR8L32q97VSqdmJQc%3D)
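A pure-Python sketch of the formula (the function name is illustrative; in PyTorch this is `nn.PReLU`), with a comment on why $a$ is learnable:

```python
def prelu(x, a):
    """PReLU with slope a on the negative side."""
    return x if x > 0 else a * x

# Why a is learnable: for x <= 0 the output is a * x, so the partial
# derivative of the output with respect to a is simply x, giving
# gradient descent a direct signal for updating a.
a = 0.25  # nn.PReLU initializes a to 0.25 by default
assert prelu(2.0, a) == 2.0
assert prelu(-2.0, a) == -0.5
```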
Leaky ReLU
$$
\text{LeakyReLU}(x) =
\begin{cases}
x, & \text{ if } x \geq 0 \\
\alpha x, & \text{ otherwise }
\end{cases}
$$
![LeakyReLU](https://zwlw-static.oss-cn-shanghai.aliyuncs.com/media/articles/user3_images/cc61c4f2b3921b41/2022/2/leakyrelu.png?OSSAccessKeyId=LTAI5t5cqAaT93BT832yXzMS&Expires=1960562140&Signature=exULQnOaXWTC%2Bl8GyKbuvBGF1YQ%3D)
RReLU
$$
\text{RReLU}(x) =
\begin{cases}
x & \text{if } x \geq 0 \\
ax & \text{ otherwise }, a \sim \mathcal{U}(\text{lower}, \text{upper})
\end{cases}
$$
Here the parameter $a$ is sampled from a uniform distribution over $(\text{lower}, \text{upper})$.
![RReLU](https://zwlw-static.oss-cn-shanghai.aliyuncs.com/media/articles/user3_images/01f398a677be498/2022/2/rrelu.png?OSSAccessKeyId=LTAI5t5cqAaT93BT832yXzMS&Expires=1960562140&Signature=GjHTAIfysf1Tu211taPEdTQY35Q%3D)
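A sketch of the sampling behavior in pure Python (names illustrative). PyTorch's `nn.RReLU` defaults to `lower = 1/8`, `upper = 1/3`, and at evaluation time uses the fixed mean slope instead of sampling:

```python
import random

def rrelu(x, lower=1/8, upper=1/3, training=True):
    """RReLU: the negative slope a is sampled from U(lower, upper) at
    train time; at eval time the fixed mean (lower + upper) / 2 is used."""
    if x >= 0:
        return x
    a = random.uniform(lower, upper) if training else (lower + upper) / 2
    return a * x

random.seed(0)
y = rrelu(-1.0)
assert -1/3 <= y <= -1/8  # the sampled slope stays inside (lower, upper)
```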
ELU
$$
f(x)= \begin{cases} x, &x \gt 0\\ \alpha(e^x - 1), & x \le 0 \end{cases}
$$
![ELU](https://zwlw-static.oss-cn-shanghai.aliyuncs.com/media/articles/user3_images/d8e2e4d4417d1ca9/2022/2/elu.png?OSSAccessKeyId=LTAI5t5cqAaT93BT832yXzMS&Expires=1960562140&Signature=jrFII6Q3UpGeuGMPtoDSLKVPWYU%3D)
CELU
$$
f(x)= \begin{cases} x, &x \gt 0\\ \alpha(e^{\frac{x}{\alpha}} - 1), & x \le 0 \end{cases}
$$
![CELU](https://zwlw-static.oss-cn-shanghai.aliyuncs.com/media/articles/user3_images/dc14d5b7ecaed91b/2022/2/celudouble.png?OSSAccessKeyId=LTAI5t5cqAaT93BT832yXzMS&Expires=1960562140&Signature=s1W9CbJn4xV7cHYZoUrcakZO1N8%3D)
SELU
$$
\text{SELU}(x) = \text{scale} * (\max(0,x) + \min(0, \alpha * (\exp(x) - 1)))
$$
![SELU](https://zwlw-static.oss-cn-shanghai.aliyuncs.com/media/articles/user3_images/6b3277407854c5fe/2022/2/selu.png?OSSAccessKeyId=LTAI5t5cqAaT93BT832yXzMS&Expires=1960562140&Signature=xdHvnA4xh79z6bbZqfUuzFVabv4%3D)
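The constants $\text{scale}$ and $\alpha$ come from the SELU paper (Klambauer et al., 2017) and are hard-coded in PyTorch; a pure-Python sketch of the formula:

```python
import math

ALPHA = 1.6732632423543772   # alpha from the SELU paper
SCALE = 1.0507009873554805   # scale from the SELU paper

def selu(x):
    """SELU(x) = scale * (max(0, x) + min(0, alpha * (exp(x) - 1)))."""
    return SCALE * (max(0.0, x) + min(0.0, ALPHA * (math.exp(x) - 1)))

assert selu(0.0) == 0.0
assert abs(selu(1.0) - SCALE) < 1e-12  # positive side is just scale * x
```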
GeLU
$$
\text{GELU}(x) = x * \Phi(x), \text{ where } \Phi(x) \text{ is the cumulative distribution function of the standard Gaussian}
$$
![GeLU](https://zwlw-static.oss-cn-shanghai.aliyuncs.com/media/articles/user3_images/fb99e0e7325df8bb/2022/2/gelu.png?OSSAccessKeyId=LTAI5t5cqAaT93BT832yXzMS&Expires=1960562140&Signature=G0FTjYq6wYql8d4LbmAorVSJNA4%3D)
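Since $\Phi(x)$ can be written via the error function, the exact GELU can be sketched in pure Python (the tanh approximation used by some frameworks is a separate variant):

```python
import math

def gelu(x):
    """Exact GELU: x * Phi(x), where
    Phi(x) = 0.5 * (1 + erf(x / sqrt(2))) is the standard normal CDF."""
    return x * 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

assert gelu(0.0) == 0.0
# For large positive x, Phi(x) -> 1, so GELU(x) -> x:
assert abs(gelu(10.0) - 10.0) < 1e-6
```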
SoftPlus
$$
\text{Softplus}(x) = \frac{1}{\beta} *
\log(1 + e^{\beta * x})
$$
![SoftPlus](https://zwlw-static.oss-cn-shanghai.aliyuncs.com/media/articles/user3_images/5faa5ffc26b9b1/2022/2/softplus.png?OSSAccessKeyId=LTAI5t5cqAaT93BT832yXzMS&Expires=1960562140&Signature=arwVvoE%2BzeLhblDQM3HcJQuSeFQ%3D)
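The role of $\beta$ is easy to see numerically: larger $\beta$ makes Softplus hug ReLU more tightly. A pure-Python sketch (PyTorch's `nn.Softplus` additionally reverts to the linear function above a threshold for numerical stability):

```python
import math

def softplus(x, beta=1.0):
    """Softplus(x) = (1 / beta) * log(1 + exp(beta * x))."""
    return math.log1p(math.exp(beta * x)) / beta

# With the default beta = 1 this is the plain formula; with a large
# beta the curve is nearly indistinguishable from ReLU:
assert abs(softplus(2.0, beta=1.0) - math.log1p(math.exp(2.0))) < 1e-12
assert abs(softplus(2.0, beta=100.0) - 2.0) < 1e-12
```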
Mish
$$
\text{Mish}(x) = x * \text{Tanh}(\text{Softplus}(x))
$$
![Mish](https://zwlw-static.oss-cn-shanghai.aliyuncs.com/media/articles/user3_images/cf969c29ab11ffdf/2022/2/mish.png?OSSAccessKeyId=LTAI5t5cqAaT93BT832yXzMS&Expires=1960562140&Signature=2X/VeqH8suDt8a4I2speUhqBHyY%3D)
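Mish composes two functions already defined above, so a pure-Python sketch is short:

```python
import math

def softplus(x):
    return math.log1p(math.exp(x))

def mish(x):
    """Mish(x) = x * tanh(softplus(x))."""
    return x * math.tanh(softplus(x))

assert mish(0.0) == 0.0
# For large x, tanh(softplus(x)) -> 1, so Mish behaves like the identity:
assert abs(mish(10.0) - 10.0) < 1e-4
```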
SiLU
$$
\text{SiLU}(x) = x * \sigma(x), \text{ where } \sigma(x) \text{ is the sigmoid function}
$$
![Silu](https://zwlw-static.oss-cn-shanghai.aliyuncs.com/media/articles/user3_images/e67ad892511bf6a4/2022/2/silu.png?OSSAccessKeyId=LTAI5t5cqAaT93BT832yXzMS&Expires=1960562140&Signature=k2RnNvbHNqctZw0hSPP4W9Emqvw%3D)
HardSwish
$$
\text{Hardswish}(x) = \begin{cases}
0 & \text{if~} x \le -3, \\
x & \text{if~} x \ge +3, \\
x \cdot (x + 3) /6 & \text{otherwise}
\end{cases}
$$
![HardSwish](https://zwlw-static.oss-cn-shanghai.aliyuncs.com/media/articles/user3_images/749fef201c3536ba/2022/2/hard_swish.png?OSSAccessKeyId=LTAI5t5cqAaT93BT832yXzMS&Expires=1960562140&Signature=aSKm9aeGlryA76p2jyk987vB0kM%3D)
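A pure-Python check of the piecewise definition (illustrative; PyTorch's version is `nn.Hardswish`). Note that the quadratic middle piece meets the two linear pieces continuously:

```python
def hardswish(x):
    """Piecewise-quadratic approximation of SiLU/Swish (formula above)."""
    if x <= -3:
        return 0.0
    if x >= 3:
        return x
    return x * (x + 3) / 6

# The pieces meet continuously at the breakpoints:
assert hardswish(-3) == 0.0   # (-3) * 0 / 6 == 0
assert hardswish(3) == 3.0    # 3 * 6 / 6 == 3
```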
TanhShrink
$$
\text{Tanhshrink}(x) = x - \tanh(x)
$$
![TanhShrink](https://zwlw-static.oss-cn-shanghai.aliyuncs.com/media/articles/user3_images/1033953c82f9ff91/2022/2/tanhshrink.png?OSSAccessKeyId=LTAI5t5cqAaT93BT832yXzMS&Expires=1960562140&Signature=y3Zpwanh1Jlqx0Hnen6QKIPP%2BpM%3D)
SoftShrink
$$
\text{SoftShrinkage}(x) =
\begin{cases}
x - \lambda, & \text{ if } x \gt \lambda \\
x + \lambda, & \text{ if } x \lt -\lambda \\
0, & \text{ otherwise }
\end{cases}
$$
![SoftShrink](https://zwlw-static.oss-cn-shanghai.aliyuncs.com/media/articles/user3_images/a853671b86bf5a57/2022/2/softshrink.png?OSSAccessKeyId=LTAI5t5cqAaT93BT832yXzMS&Expires=1960562140&Signature=TkWEwlT0n93cNoNuatx0DfiGaww%3D)
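A pure-Python sketch of soft shrinkage (the default $\lambda = 0.5$ matches `nn.Softshrink`): inputs inside $[-\lambda, \lambda]$ are zeroed, everything else is pulled toward zero by $\lambda$:

```python
def softshrink(x, lambd=0.5):
    """SoftShrink: zero inside [-lambda, lambda], shrink by lambda outside."""
    if x > lambd:
        return x - lambd
    if x < -lambd:
        return x + lambd
    return 0.0

assert softshrink(2.0) == 1.5
assert softshrink(-2.0) == -1.5
assert softshrink(0.3) == 0.0  # inside the dead zone
```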
HardShrink
$$
\text{HardShrink}(x) = \begin{cases}
x, & \text{ if } x \gt \lambda \\
x, & \text{ if } x \lt -\lambda \\
0, & \text{ otherwise }
\end{cases}
$$
![HardShrink](https://zwlw-static.oss-cn-shanghai.aliyuncs.com/media/articles/user3_images/d447edbb7b7eac78/2022/2/hardshrink.png?OSSAccessKeyId=LTAI5t5cqAaT93BT832yXzMS&Expires=1960562140&Signature=xk56/6fuRgXod7Cvz5yRdbA7dTo%3D)