Introduction to Activation Functions in PyTorch

Created: 2025-03-12
Updated: 2025-03-12

Example 1: the functional form, torch.sigmoid

import torch

x = torch.randn(3, 2)   # random input tensor
y = torch.sigmoid(x)    # apply the sigmoid function element-wise
print(y)

Example 2: the module form, nn.Sigmoid

import torch
import torch.nn as nn

activate = nn.Sigmoid()  # module form of the sigmoid activation

x = torch.randn(3, 2)
y = activate(x)          # same element-wise result as torch.sigmoid(x)
print(y)
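
Both forms compute the same mapping, sigmoid(x) = 1 / (1 + exp(-x)), element-wise; the functional form suits one-off calls, while the module form can be stored as a layer in a model. A quick equivalence check:

import torch
import torch.nn as nn

x = torch.randn(3, 2)
# The functional and module forms produce identical results.
print(torch.allclose(torch.sigmoid(x), nn.Sigmoid()(x)))  # True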

Non-linear Activations (weighted sum, nonlinearity)

Activation function and description (a usage sketch follows the table):
nn.ELU: Applies the Exponential Linear Unit (ELU) function, element-wise.
nn.Hardshrink: Applies the Hard Shrinkage (Hardshrink) function element-wise.
nn.Hardsigmoid: Applies the Hardsigmoid function element-wise.
nn.Hardtanh: Applies the HardTanh function element-wise.
nn.Hardswish: Applies the Hardswish function, element-wise.
nn.LeakyReLU: Applies the LeakyReLU function element-wise.
nn.LogSigmoid: Applies the Logsigmoid function element-wise.
nn.MultiheadAttention: Allows the model to jointly attend to information from different representation subspaces.
nn.PReLU: Applies the element-wise PReLU function.
nn.ReLU: Applies the rectified linear unit function element-wise.
nn.ReLU6: Applies the ReLU6 function element-wise.
nn.RReLU: Applies the randomized leaky rectified linear unit function, element-wise.
nn.SELU: Applies the SELU function element-wise.
nn.CELU: Applies the CELU function element-wise.
nn.GELU: Applies the Gaussian Error Linear Units function.
nn.Sigmoid: Applies the Sigmoid function element-wise.
nn.SiLU: Applies the Sigmoid Linear Unit (SiLU) function, element-wise.
nn.Mish: Applies the Mish function, element-wise.
nn.Softplus: Applies the Softplus function element-wise.
nn.Softshrink: Applies the soft shrinkage function element-wise.
nn.Softsign: Applies the element-wise Softsign function.
nn.Tanh: Applies the Hyperbolic Tangent (Tanh) function element-wise.
nn.Tanhshrink: Applies the element-wise Tanhshrink function.
nn.Threshold: Thresholds each element of the input Tensor.
nn.GLU: Applies the gated linear unit function.
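
A minimal usage sketch (the particular activations, the LeakyReLU slope of 0.1, and the tensor shape here are chosen only for illustration): each module is constructed once and then applied element-wise, so the output shape matches the input shape.

import torch
import torch.nn as nn

x = torch.randn(3, 2)

# Each module maps the input element-wise; the shape is preserved.
for act in (nn.ReLU(), nn.LeakyReLU(0.1), nn.GELU(), nn.Tanh()):
    y = act(x)
    print(act.__class__.__name__, y.shape)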

Non-linear Activations (other)

Activation function and description (a usage sketch follows the table):
nn.Softmin: Applies the Softmin function to an n-dimensional input Tensor.
nn.Softmax: Applies the Softmax function to an n-dimensional input Tensor.
nn.Softmax2d: Applies SoftMax over features to each spatial location.
nn.LogSoftmax: Applies the log(Softmax(x)) function to an n-dimensional input Tensor.
nn.AdaptiveLogSoftmaxWithLoss: Efficient softmax approximation.
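
Unlike the element-wise functions in the previous table, these normalize across a dimension of the input. A minimal sketch (dim=1 is chosen here only for illustration):

import torch
import torch.nn as nn

x = torch.randn(3, 2)

softmax = nn.Softmax(dim=1)         # each row of the output sums to 1
log_softmax = nn.LogSoftmax(dim=1)  # numerically stabler than log(softmax(x))

print(softmax(x).sum(dim=1))        # a tensor of ones (up to rounding)
print(log_softmax(x))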