PyTorch tanh

Definition of PyTorch tanh

The hyperbolic tangent function, abbreviated as tanh, is one of several common activation functions. It is defined as tanh(x) = (exp(x) - exp(-x)) / (exp(x) + exp(-x)) and has a range of (-1, 1): strongly negative inputs are mapped to values near -1, and strongly positive inputs to values near +1.

What is PyTorch tanh?

PyTorch is an open-source machine learning framework and library originally developed at Facebook. Deep neural networks and natural language processing are two of its applications. An activation function accepts any real number as input and converts it to an output. In principle any function could be used as an activation function, but in practice researchers employ nonlinear functions so that networks can model nonlinear relationships.

The following are characteristics of an ideal activation function:


Nonlinearity – The activation function must be able to introduce nonlinearity into the network, particularly in the hidden-layer neurons. This is because purely linear mappings can rarely represent real-world relationships.


Differentiability – The activation function must be differentiable. The network learns by back-propagating the loss from the output nodes to the hidden units during the training stage; the backpropagation algorithm uses the derivative of each neuron's activation function to adjust the weights of the hidden layers so that the error shrinks in the next training iteration.
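To make the role of the derivative concrete, here is a minimal sketch (the weight, input, target, and learning rate below are illustrative assumptions, not from the article) of one gradient-descent update for a single tanh neuron, applying the chain rule with d tanh(z)/dz = 1 - tanh²(z):

import torch

w = torch.tensor(0.5)                    # hypothetical weight
x, target = torch.tensor(1.0), torch.tensor(0.2)
lr = 0.1                                 # illustrative learning rate

z = w * x
y = torch.tanh(z)
# dL/dw for the squared error L = (y - target)^2, by the chain rule
grad_w = 2 * (y - target) * (1 - torch.tanh(z) ** 2) * x
w = w - lr * grad_w                      # one gradient-descent step
print(w)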

PyTorch tanh method

The most widely used activation functions are included in the PyTorch library, provided both as classes (layers) and as functions. Tanh is a scaled sigmoid and therefore has the same gradient problems as the original sigmoid function; difficulties with vanishing gradients can be mitigated by adjusting the weights.

PyTorch's tanh is characterized by the output it produces, which lies between -1 and 1. The hyperbolic tangent is differentiable at every point, and its derivative is 1 - tanh²(x). Because the derivative is expressed through tanh itself, the function's forward output can be reused to speed up backpropagation. Tanh is similar to sigmoid except that it is zero-centered and ranges from -1 to 1, so the function's output has roughly zero mean. As a result, the network tends to converge more quickly: when the mean of each input to a layer is near 0, convergence is usually faster, which is the same motivation behind Batch Normalization.
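Since tanh is a scaled sigmoid, the identity tanh(x) = 2 * sigmoid(2x) - 1 holds; a minimal sketch (not from the original article) verifies it numerically:

import torch

x = torch.linspace(-3, 3, 7)
lhs = torch.tanh(x)
rhs = 2 * torch.sigmoid(2 * x) - 1           # tanh as a shifted, scaled sigmoid
print(torch.allclose(lhs, rhs, atol=1e-6))   # True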

The class goes like this:

class torch.nn.Tanh

The element-wise hyperbolic tangent is generated, so if the input contains several elements, tanh is applied to each of them. The functional form is:

torch.tanh(x, out=None)

Here, x is the name of the input tensor, and out (optional) is the name of the output tensor. The return value is a tensor of the same type as x.
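A short usage sketch of both forms (the input values here are illustrative, not from the article):

import torch

x = torch.tensor([-1.0, 0.0, 1.0])
y = torch.tanh(x)          # functional form
out = torch.empty(3)
torch.tanh(x, out=out)     # writes into a preallocated output tensor
print(y)                   # tensor([-0.7616,  0.0000,  0.7616])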

Tanh Activation Function Benefits

Tanh is non-linear and differentiable, both of which are desirable properties for an activation function. A base class for activation functions can be defined as:

import torch
import torch.nn as nn

class ActivationFunction(nn.Module):
    def __init__(self):
        super().__init__()
        self.name = self.__class__.__name__
        self.config = {"name": self.name}

Because its output ranges from -1 to +1, it can map a neuron's output to negative as well as positive values:

tanh(x)=(exp(x)-exp(-x))/(exp(x)+exp(-x))

class Tanh(ActivationFunction):
    def forward(self, x):
        x_exp, neg_x_exp = torch.exp(x), torch.exp(-x)
        return (x_exp - neg_x_exp) / (x_exp + neg_x_exp)
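As a quick sanity check (a sketch with arbitrary input values), the hand-written version can be compared against the built-in torch.tanh:

act = Tanh()
x = torch.tensor([-2.0, 0.0, 2.0])
print(act(x))                                  # tensor([-0.9640,  0.0000,  0.9640])
print(torch.allclose(act(x), torch.tanh(x)))   # True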

Tanh activation in a feed-forward neural network

Below, a hidden layer that combines a linear transformation with the non-linear tanh activation is shown. During training, the output would report the number of iterations, the hidden-layer configuration, and the accuracy.

import torch
import torch.nn as nn
import torchvision.transforms as transforms
import torchvision.datasets as dsets

# Load the MNIST training and test sets
train_dataset = dsets.MNIST(root='./data',
                            train=True,
                            transform=transforms.ToTensor(),
                            download=True)
test_dataset = dsets.MNIST(root='./data',
                           train=False,
                           transform=transforms.ToTensor())

class FeedforwardNeuralNetModel(nn.Module):
    def __init__(self, input_dim, hidden_dim, output_dim):
        super(FeedforwardNeuralNetModel, self).__init__()
        self.f1 = nn.Linear(input_dim, hidden_dim)   # linear hidden layer
        self.tanh = nn.Tanh()                        # non-linearity
        self.f2 = nn.Linear(hidden_dim, output_dim)  # linear output layer

    def forward(self, x):
        out = self.f1(x)
        out = self.tanh(out)
        out = self.f2(out)
        return out
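A minimal sketch of how this model could be instantiated for MNIST (the layer sizes below are assumptions for 28×28 images and 10 digit classes, not specified in the article):

# 784 input pixels, a hypothetical hidden size of 100, 10 output classes
model = FeedforwardNeuralNetModel(input_dim=28*28, hidden_dim=100, output_dim=10)
image, label = train_dataset[0]
logits = model(image.view(1, -1))   # flatten the 28x28 image before the linear layer
print(logits.shape)                 # torch.Size([1, 10])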

PyTorch tanh function

In PyTorch, the function torch.tanh() implements the hyperbolic tangent. The input is a tensor of real numbers, and the result lies in the range (-1, 1). If the input contains several elements, the hyperbolic tangent is computed element-wise. The Tanh() activation can also be loaded as a layer through the nn package; random data is then generated and passed through it to obtain the result.

The Tanh function is called as follows:

import torch
import torch.nn as nn

tanh = nn.Tanh()
input = torch.randn(2)
output = tanh(input)

Let's see sample code here:

import torch

x = torch.rand(4, 2)
print(x)

Output:

(a 4×2 tensor of random values; the exact numbers vary from run to run)

And lastly, plotting each activation function gives a better grasp of what it does. Besides the activation value itself, the gradient of the function is a significant factor for optimizing the neural network. In PyTorch, gradients are computed by calling the backward() function.
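As a sketch of that idea (the input values are arbitrary), the gradient computed by backward() can be checked against the closed-form derivative 1 - tanh²(x):

import torch

x = torch.tensor([0.0, 1.0, 2.0], requires_grad=True)
y = torch.tanh(x)
y.sum().backward()                          # populates x.grad with d(sum)/dx
analytic = 1 - torch.tanh(x.detach()) ** 2  # closed-form derivative
print(torch.allclose(x.grad, analytic))     # True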

PyTorch tanh Examples

Example #1

import torch

x = torch.FloatTensor([2.0, -0.4, 1.1, -2.0, -5.4])
print(x)
y = torch.tanh(x)
print(y)

Output:

tensor([ 2.0000, -0.4000,  1.1000, -2.0000, -5.4000])
tensor([ 0.9640, -0.3800,  0.8005, -0.9640, -1.0000])

Example #2

import torch
import numpy as np
import matplotlib.pyplot as plt

m = np.linspace(-4, 4, 13)
n = torch.tanh(torch.FloatTensor(m))
print(n)
plt.plot(m, n.numpy(), color='green', marker='o')
plt.title("torch.tanh")
plt.xlabel("X")
plt.ylabel("Y")
plt.show()

The print(n) call produces:

tensor([-0.9993, -0.9975, -0.9904, -0.9640, -0.8701, -0.5828, 0.0000, 0.5828, 0.8701, 0.9640, 0.9904, 0.9975, 0.9993])

Explanation

The above code is a visualization example: it imports the NumPy and PyTorch libraries, and matplotlib is used for plotting the tanh values. Here the vector has 13 elements and ranges from -4 to 4. Finally, a plot is built, shown in the graph below.

Output:

(plot of torch.tanh over the range [-4, 4], titled "torch.tanh")

Example #3: Activation function tanh() using R Code

library(rTorch)
library(ggplot2)

a <- torch$range(-4., 4., 0.2)
b <- torch$tanh(a)
dfr <- data.frame(a = a$numpy(), sx = b$numpy())
dfr
ggplot(dfr, aes(x = a, y = sx)) +
  geom_point() +
  ggtitle("tanh")

Output:

(scatter plot of tanh over the range [-4, 4])

Example #4

import numpy as np
import torch

data = [1.2, -3.2, 4.3, 0.5, -0.6, -1.8]

def tanh(x):
    lists = list()
    for j in range(len(x)):
        lists.append((np.exp(x[j]) - np.exp(-x[j])) / (np.exp(x[j]) + np.exp(-x[j])))
    return lists

def tanh_derivative(x):
    return 1 - np.power(tanh(x), 2)

out = [round(value, 3) for value in tanh(data)]
print("Result of numpy tanh:", out)
print("Result of numpy tanh derivative:", [round(value, 3) for value in tanh_derivative(data)])
print("Result of derivative2:", [round(1. - value * value, 3) for value in tanh(data)])

input = torch.FloatTensor(data)
m = torch.nn.Tanh()
out2 = m(input)
print("pytorch tanh:", out2)
print("pytorch tanh derivative:", 1. - out2 * out2)

Explanation

The NumPy and PyTorch implementations shown above produce matching results:

Output:

(matching tanh values and derivatives from the NumPy and PyTorch implementations)

Conclusion

In summary, we saw how to use tanh() in PyTorch and how to compute its derivative, through straightforward explanations and code examples.

Recommended Articles

This is a guide to PyTorch tanh. Here we discuss the definition, what PyTorch tanh is, the PyTorch tanh method, and examples with code implementation. You may also have a look at the following articles to learn more –

  1. PyTorch CUDA
  2. PyTorch Detach
  3. pytorch gan
  4. PyTorch Flatten