torch.nn.functional.selu¶

torch.nn.functional.selu(input, inplace=False) → Tensor [source]

Applies element-wise,

\text{SELU}(x) = scale * (\max(0, x) + \min(0, \alpha * (\exp(x) - 1))),

with \alpha = 1.6732632423543772848170429916717 and scale = 1.0507009873554804934193349852946.

See SELU for more details.

Return type
    Tensor
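
A minimal usage sketch (the tensor values are illustrative, not from the original page): it applies F.selu to a small tensor and checks the result against the formula above using the documented alpha and scale constants.

    import torch
    import torch.nn.functional as F

    # Example input with negative, zero, and positive entries.
    x = torch.tensor([-2.0, -0.5, 0.0, 0.5, 2.0])

    # Apply SELU element-wise (out-of-place by default; pass inplace=True to modify x).
    y = F.selu(x)

    # Recompute SELU(x) = scale * (max(0, x) + min(0, alpha * (exp(x) - 1))) manually
    # with the constants given in the documentation.
    alpha = 1.6732632423543772848170429916717
    scale = 1.0507009873554804934193349852946
    manual = scale * (torch.clamp(x, min=0)
                      + torch.minimum(torch.zeros_like(x), alpha * (torch.exp(x) - 1)))

    print(torch.allclose(y, manual))  # True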