PyTorch Activation Function

Introduction to PyTorch Activation Function

The following article provides an outline for the PyTorch activation function. The weighted sum of a neuron's inputs is passed through an activation function, whose job is to introduce non-linearity into the neural network's decision boundary. An activation function accepts any real number as input and maps it to an output. Although any function could in principle serve as an activation function, we employ non-linear functions to achieve this goal.

What is the PyTorch activation function?

An activation function can be applied to a network layer in two ways:

  1. As an activation layer – classes (such as nn.ReLU) that are added to the model like any other module.
  2. As a function – functional definitions (such as torch.relu) that are called directly on a tensor.

Using the PyTorch activation function

Any activation function can be employed, including f(x) = x, the linear or identity function. When choosing and implementing an activation function, its computational efficiency is an important factor to consider. Without non-linear activation functions, the network is just a stack of linear regression models; therefore, we use an activation function in each of the layers.
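As a minimal sketch of why this matters (not from the original article; the layer sizes are arbitrary), two stacked Linear layers with no activation in between collapse to a single linear map whose weight is the product of the two weight matrices:

import torch
import torch.nn as nn

torch.manual_seed(0)

# Two Linear layers stacked with no activation in between ...
stacked = nn.Sequential(nn.Linear(4, 8, bias=False), nn.Linear(8, 3, bias=False))

# ... behave exactly like one Linear layer whose weight is the product of the two weights
single = nn.Linear(4, 3, bias=False)
with torch.no_grad():
    single.weight.copy_(stacked[1].weight @ stacked[0].weight)

x = torch.randn(5, 4)
print(torch.allclose(stacked(x), single(x), atol=1e-6))  # True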

Types of PyTorch Activation Function

The most widely used activation functions are included in the PyTorch library, provided both as modules (classes) and as functional definitions. Some of the most regularly used activation functions and their corresponding class and function names are listed here. The most popular functions are:

  1. Step Function
  2. Sigmoid
  3. Tanh
  4. RELU
  5. Leaky RELU
  6. SoftMax

Rectified Linear Unit (ReLU), Sigmoid, and Tanh are three activation functions that play a key role in the operation of neural networks. ReLU in particular has withstood the test of time and generalizes extremely well over a wide range of deep learning applications. For example, we can use these in classic PyTorch as follows:

Add the nn.Sigmoid(), nn.Tanh(), or nn.ReLU() activation functions directly to the neural network, for example inside nn.Sequential. Tanh and Sigmoid are the earliest activation functions in terms of neural network history. Tanh maps all inputs into the (-1.0, 1.0) range, with the steepest slope around x = 0. The Sigmoid, on the other hand, maps all inputs into the (0.0, 1.0) range, also with its steepest slope around x = 0. ReLU is different: for x <= 0.0 it outputs 0.0, and in all other cases the output equals the input x.
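As a quick check of these ranges (a minimal sketch, not part of the original article), the three functions can be applied to the same tensor:

import torch

x = torch.linspace(-6.0, 6.0, steps=7)
print(torch.tanh(x))     # every value lies in (-1.0, 1.0), steepest slope near x = 0
print(torch.sigmoid(x))  # every value lies in (0.0, 1.0), steepest slope near x = 0
print(torch.relu(x))     # negative inputs become 0.0, the rest pass through unchanged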

Sigmoid:

The PyTorch sigmoid function maps any real number to a value between 0 and 1, element-wise. It is a typical activation function in classification algorithms (including logistic regression), where its output is interpreted as the probability that the label is true, i.e., p(y == 1).
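For example (a small illustrative sketch with an arbitrary input value), a raw model output can be turned into such a probability:

import torch

logit = torch.tensor([2.0])  # raw, unbounded model output (arbitrary example value)
p = torch.sigmoid(logit)     # squashed into (0, 1) and read as p(y == 1)
print(p)                     # tensor([0.8808])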

tanh:

PyTorch tanh squashes its output into the range between -1 and 1. The hyperbolic tangent is differentiable at every point, and its derivative is 1 - tanh²(x). Because the derivative is expressed in terms of tanh itself, the function's forward output can be reused to speed up backpropagation. Tanh is similar to Sigmoid except that it is zero-centered and ranges from -1 to 1.
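As a brief sketch (not from the original article), autograd can be used to confirm that the gradient of tanh matches the analytic derivative 1 - tanh²(x):

import torch

x = torch.tensor([-2.0, -0.5, 0.0, 0.5, 2.0], requires_grad=True)
torch.tanh(x).sum().backward()

# autograd's gradient equals the analytic derivative 1 - tanh^2(x)
print(torch.allclose(x.grad, 1 - torch.tanh(x.detach()) ** 2))  # True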

RELU:

The activation function nn.ReLU is used to make the network non-linear so that it can fit complex data. ReLU is defined like this:

relu(x) = { 0 if x ≤ 0, x if x > 0 }, i.e., relu(x) = max(0, x).
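A short sanity check of this definition (a small sketch, not part of the original article):

import torch

x = torch.tensor([-2.1, -1.0, 0.0, 1.0, 2.1])
print(torch.relu(x))                          # negatives clamped to 0.0, the rest unchanged
print(torch.maximum(torch.zeros_like(x), x))  # same result: relu(x) = max(0, x)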

Alternatively, the functional forms of these activation functions can be applied in the forward pass. Below is an example showing both styles:

import torch
import torch.nn as nn
import torch.nn.functional as F

# Module style: the activations are added directly as layers inside nn.Sequential
class NetSequential(nn.Module):
    def __init__(self):
        super().__init__()
        self.layers = nn.Sequential(
            nn.Linear(27 * 27, 255),
            nn.Sigmoid(),
            nn.Linear(255, 127),
            nn.Tanh(),
            nn.Linear(127, 10),
            nn.ReLU()
        )
    def forward(self, a):
        return self.layers(a)

# Functional style: plain Linear layers, with the activations applied in forward
class NetFunctional(nn.Module):
    def __init__(self):
        super().__init__()
        self.lin1 = nn.Linear(27 * 27, 255)
        self.lin2 = nn.Linear(255, 127)
        self.lin3 = nn.Linear(127, 10)
    def forward(self, a):
        a = torch.sigmoid(self.lin1(a))
        a = torch.tanh(self.lin2(a))
        a = F.relu(self.lin3(a))
        return a

Importing PyTorch activation function

Importing the Activation Function Definition:

The torch.nn.functional module contains the functional definitions of the activation functions. F is the most common import name for this module.

import torch.nn as nn
import torch.nn.functional as F

Example of PyTorch Activation Function

Let us look at the different types of activation layers with examples.

Example-1

Using Sigmoid

import torch

torch.manual_seed(1)
a = torch.randn((2, 2, 2))
b = torch.sigmoid(a)
print(b.min(), b.max())  # every value lies strictly between 0 and 1

Explanation

The output of this snippet shows how the sigmoid function is used: the minimum and maximum of the resulting tensor are printed, confirming that every value produced from the randomly generated input lies between 0 and 1.

Output

PyTorch Activation Function output 1

Example-2

import torch
import torch.nn as nn
import torch.nn.functional as F

a = torch.tensor([-1.1, 1.1, 3.0, 4.0])

# Softmax: functional form and module form
out = torch.softmax(a, dim=0)
print(out)
s1 = nn.Softmax(dim=0)
out = s1(a)
print(out)

# Sigmoid: functional form and module form
out = torch.sigmoid(a)
print(out)
s2 = nn.Sigmoid()
out = s2(a)
print(out)

# Tanh: functional form and module form
out = torch.tanh(a)
print(out)
t = nn.Tanh()
out = t(a)
print(out)

# ReLU: functional form and module form
out = torch.relu(a)
print(out)
relu = nn.ReLU()
out = relu(a)
print(out)

# Leaky ReLU: functional form and module form
out = F.leaky_relu(a)
print(out)
lrelu = nn.LeakyReLU()
out = lrelu(a)
print(out)

# Variant 1: activations created as modules in __init__ and applied in forward
class NeuralNet(nn.Module):
    def __init__(self, input_size, hidden_size):
        super(NeuralNet, self).__init__()
        self.linear1 = nn.Linear(input_size, hidden_size)
        self.relu = nn.ReLU()
        self.linear2 = nn.Linear(hidden_size, 1)
        self.sigmoid = nn.Sigmoid()

    def forward(self, a):
        out1 = self.linear1(a)
        out1 = self.relu(out1)
        out1 = self.linear2(out1)
        out1 = self.sigmoid(out1)
        return out1

# Variant 2: activations applied directly as functions in forward
class NeuralNet2(nn.Module):
    def __init__(self, input_size, hidden_size):
        super(NeuralNet2, self).__init__()
        self.linear1 = nn.Linear(input_size, hidden_size)
        self.linear2 = nn.Linear(hidden_size, 1)

    def forward(self, a):
        out1 = torch.relu(self.linear1(a))
        out1 = torch.sigmoid(self.linear2(out1))
        return out1

Output

PyTorch Activation Function output 2

Example-3

Using tanh Activation Function

import torch
import numpy as np
import matplotlib.pyplot as plt

m = np.linspace(-4, 4, 13)
n = torch.tanh(torch.FloatTensor(m))
print(n)
plt.plot(m, n.numpy(), color='green', marker="o")
plt.title("torch.tanh")
plt.xlabel("X")
plt.ylabel("Y")
plt.show()

print(n) produces:

tensor([-0.9993, -0.9975, -0.9904, -0.9640, -0.8701, -0.5828, 0.0000, 0.5828, 0.8701, 0.9640, 0.9904, 0.9975, 0.9993])

Explanation

The above code is a visualization example: it imports NumPy and the PyTorch library, and matplotlib is used to plot the tanh values. Here the vector has 13 elements ranging from -4 to 4. Finally, a plot is built, shown in the graph below.

PyTorch Activation Function output 3

Output

output 4

Example-4

Using RELU

import torch
import torch.nn as nn

inp = torch.tensor([-2.1, -1.0, 0.0, 1.0, 2.1])
layer1 = nn.ReLU()
output = layer1(inp)
print(output)

Output

output 5

Conclusion

That is how activation functions are applied in PyTorch. A neural network becomes even more powerful when it has multiple hidden layers. Therefore, choosing the proper activation function for each layer is critical and can significantly affect learning speed.

Recommended Articles

This is a guide to PyTorch Activation Function. Here we discuss the different types of Activation layers with examples and outputs. You may also look at the following articles to learn more –

  1. PyTorch Detach
  2. What is PyTorch?
  3. PyTorch GPU
  4. Dataset Pytorch