Example 1

```python
import torch

x = torch.randn(3, 2)
y = torch.sigmoid(x)  # functional form: applies sigmoid element-wise
print(y)
```
Example 2

```python
import torch
import torch.nn as nn

activate = nn.Sigmoid()  # module form: element-wise sigmoid as a layer
x = torch.randn(3, 2)
y = activate(x)
print(y)
```
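The two examples compute the same thing: `nn.Sigmoid` is a stateless module wrapping the same element-wise operation as `torch.sigmoid`. A quick check of this equivalence (a minimal sketch, reusing the shape from the examples above):

```python
import torch
import torch.nn as nn

x = torch.randn(3, 2)
# Functional call and module call produce identical results.
print(torch.allclose(torch.sigmoid(x), nn.Sigmoid()(x)))  # True
```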
Non-linear activations (weighted sum, nonlinearity)

| Activation | Description |
| --- | --- |
| nn.ELU | Applies the Exponential Linear Unit (ELU) function, element-wise. |
| nn.Hardshrink | Applies the Hard Shrinkage (Hardshrink) function element-wise. |
| nn.Hardsigmoid | Applies the Hardsigmoid function element-wise. |
| nn.Hardtanh | Applies the HardTanh function element-wise. |
| nn.Hardswish | Applies the Hardswish function, element-wise. |
| nn.LeakyReLU | Applies the LeakyReLU function element-wise. |
| nn.LogSigmoid | Applies the LogSigmoid function element-wise. |
| nn.MultiheadAttention | Allows the model to jointly attend to information from different representation subspaces. |
| nn.PReLU | Applies the element-wise PReLU function. |
| nn.ReLU | Applies the rectified linear unit function element-wise. |
| nn.ReLU6 | Applies the ReLU6 function element-wise. |
| nn.RReLU | Applies the randomized leaky rectified linear unit function, element-wise. |
| nn.SELU | Applies the SELU function element-wise. |
| nn.CELU | Applies the CELU function element-wise. |
| nn.GELU | Applies the Gaussian Error Linear Units function. |
| nn.Sigmoid | Applies the Sigmoid function element-wise. |
| nn.SiLU | Applies the Sigmoid Linear Unit (SiLU) function, element-wise. |
| nn.Mish | Applies the Mish function, element-wise. |
| nn.Softplus | Applies the Softplus function element-wise. |
| nn.Softshrink | Applies the soft shrinkage function element-wise. |
| nn.Softsign | Applies the element-wise Softsign function. |
| nn.Tanh | Applies the Hyperbolic Tangent (Tanh) function element-wise. |
| nn.Tanhshrink | Applies the element-wise Tanhshrink function. |
| nn.Threshold | Thresholds each element of the input Tensor. |
| nn.GLU | Applies the gated linear unit function. |
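The element-wise modules in this table all share the calling pattern shown in Example 2: construct the module, then call it on a tensor of any shape, and the output keeps that shape. A minimal sketch (the four activations picked here are just an illustrative subset of the table):

```python
import torch
import torch.nn as nn

x = torch.randn(3, 2)

# Each module maps the input element-wise, so shapes are unchanged.
for act in (nn.ReLU(), nn.LeakyReLU(0.1), nn.GELU(), nn.Tanh()):
    y = act(x)
    print(type(act).__name__, y.shape)  # every output is torch.Size([3, 2])
```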
Non-linear activations (other)

Unlike the element-wise functions above, these normalize over a dimension of the input (see the sketch after the table).

| Activation | Description |
| --- | --- |
| nn.Softmin | Applies the Softmin function to an n-dimensional input Tensor. |
| nn.Softmax | Applies the Softmax function to an n-dimensional input Tensor. |
| nn.Softmax2d | Applies SoftMax over features to each spatial location. |
| nn.LogSoftmax | Applies the log(Softmax(x)) function to an n-dimensional input Tensor. |
| nn.AdaptiveLogSoftmaxWithLoss | Efficient softmax approximation. |
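Because these reduce over a dimension rather than acting element-wise, you pass `dim` explicitly, and for Softmax the outputs along that dimension sum to 1. A minimal sketch:

```python
import torch
import torch.nn as nn

x = torch.randn(3, 2)

softmax = nn.Softmax(dim=1)      # normalize across each row
log_softmax = nn.LogSoftmax(dim=1)

p = softmax(x)
print(p.sum(dim=1))  # each row sums to 1 (up to floating-point error)
# LogSoftmax(x) is numerically equivalent to log(Softmax(x)).
print(torch.allclose(log_softmax(x), torch.log(p)))  # True
```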