PyTorch One Hot Encoding

Introduction to PyTorch One Hot Encoding

The following article provides an outline of PyTorch one-hot encoding. PyTorch provides many functions for implementing deep learning, and one_hot() is one of them. Basically, the one_hot() function converts class indices into one-hot encoded target values. In machine learning, we sometimes need to convert a tensor of class indices into a one-hot encoding, and one_hot() does exactly that. The function also works when the index tensor has more than one dimension, in which case it encodes along the last dimension. Using it where appropriate helps us implement deep learning models cleanly and efficiently.

Overview of PyTorch One Hot Encoding

A one-hot encoding is a representation of categorical variables as binary vectors. This first requires the categorical values to be mapped to integer values. Each integer value is then represented as a binary vector that is all zeros except at the index of that integer, which is marked with a 1. In other words, PyTorch has a one_hot() function for converting class indices to one-hot encoded targets. If the index tensor has more than one dimension, one_hot() encodes along the last axis. To reverse the operation, converting a tensor from one-hot encoding back to class indices, use the .argmax() method over the last dimension. One-hot encoding is a good trick to know in PyTorch; however, note that you do not actually need it if you are building a classifier with cross-entropy loss. In that case, simply pass the class indices to the loss function, and PyTorch will handle the rest.
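
For instance, here is a minimal sketch (the batch of logits and targets are just illustrative values) showing that nn.CrossEntropyLoss works directly on class indices, with no one-hot conversion required:

import torch
import torch.nn as nn

# a toy batch: 4 samples, 5 classes (hypothetical values for illustration)
logits = torch.randn(4, 5)
targets = torch.tensor([2, 0, 4, 1])   # plain class indices, not one-hot vectors

loss_fn = nn.CrossEntropyLoss()
loss = loss_fn(logits, targets)        # PyTorch handles the encoding internally
print(loss)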

A one-hot encoding makes the representation of categorical data more expressive.

Many machine learning algorithms cannot work with categorical data directly. Instead, the categories must be converted into numbers. This is required for both input and output variables that are categorical.

We could use an integer encoding directly, rescaled where needed. This may work for problems where there is a natural ordinal relationship between the categories, and therefore between the integer values, for example, temperature labels such as 'cold,' 'warm,' and 'hot.'

A problem arises when there is no such relationship; forcing an ordering onto the data can make it harder for the model to learn. An example would be the labels 'dog' and 'cat.'

In these cases, we want to give the network the more expressive ability to learn probability-like values for each possible label. One-hot encoding solves exactly this problem, as sketched in the example below.
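
As a rough sketch (the label sets and mappings here are just illustrative assumptions), an integer encoding is enough for ordered labels, while unordered labels are better one-hot encoded:

import torch
import torch.nn.functional as F

# ordinal labels: an integer encoding preserves the natural order
temperature = {'cold': 0, 'warm': 1, 'hot': 2}

# non-ordinal labels: integers would impose a false order, so use one-hot vectors
animals = {'dog': 0, 'cat': 1}
indices = torch.tensor([animals['dog'], animals['cat'], animals['dog']])
print(F.one_hot(indices, num_classes=len(animals)))
# tensor([[1, 0],
#         [0, 1],
#         [1, 0]])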

Creating PyTorch One Hot Encoding

Now let's see how we can use the one_hot() function, as follows.

import torch
import torch.nn.functional as Fun

# a tensor of class indices
A = torch.tensor([3, 4, 5, 0, 1, 2])
# convert each index into a one-hot vector of length 7
output = Fun.one_hot(A, num_classes=7)
print(output)

Explanation

In the above example, we implement the one_hot() encoding function. First, we import the required packages, torch and torch.nn.functional. After that, we create a tensor of class indices and apply the one_hot() function with num_classes=7, as shown. The result of running the code is shown in the screenshot below.

PyTorch One Hot Encoding 1
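
Since the index tensor is fixed, the printed result is deterministic; it should be the following 6 x 7 tensor, with a single 1 in each row at the position of the corresponding index:

tensor([[0, 0, 0, 1, 0, 0, 0],
        [0, 0, 0, 0, 1, 0, 0],
        [0, 0, 0, 0, 0, 1, 0],
        [1, 0, 0, 0, 0, 0, 0],
        [0, 1, 0, 0, 0, 0, 0],
        [0, 0, 1, 0, 0, 0, 0]])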

Now let's see another example of the one_hot() function, as follows.

import torch
import torch.nn.functional as Fun

# random scores for a batch of 3 x 14 positions over 126 classes
A = torch.rand(3, 14, 126)
# pick the highest-scoring class along the last dimension, then one-hot encode it
output = Fun.one_hot(A.argmax(dim=2), 126)
print(output)

Explanation

In the above example, we again implement the one_hot() function. First, we import the required packages. After that, we create a tensor using the rand() function, and then we apply one_hot() to the result of argmax(), as shown. The result of running the code is shown in the screenshot below.

PyTorch One Hot Encoding 2
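
As a quick check (a sketch reusing A and output from the example above), converting the one-hot tensor back with .argmax() along the last dimension recovers the class indices exactly:

recovered = output.argmax(dim=2)
print(torch.equal(recovered, A.argmax(dim=2)))   # prints True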

Now let's see an example of manual one-hot encoding, as follows.

from numpy import argmax
# input string and the alphabet of possible characters
i_string = 'welcome in one hot function'
print(i_string)
char_al = 'abcdefghijklmnopqrstuvwxyz '
# mappings between characters and integers
character_to_integer = dict((c, i) for i, c in enumerate(char_al))
integer_to_character = dict((i, c) for i, c in enumerate(char_al))
# integer encode the input string
integer_encoded = [character_to_integer[char] for char in i_string]
print(integer_encoded)
# one-hot encode each integer
onehot_en = list()
for value in integer_encoded:
    l = [0 for _ in range(len(char_al))]
    l[value] = 1
    onehot_en.append(l)
print(onehot_en)
# invert the first one-hot vector back to a character
invert = integer_to_character[argmax(onehot_en[0])]
print(invert)

Explanation

In the above example, we implement manual one-hot encoding. We first define the input string and the alphabet of all possible input characters. After that, we integer encode the string and one-hot encode each integer, as shown. Finally, we invert the first one-hot vector back to a character using the argmax() function. The result of running the code is shown in the screenshot below.

PyTorch One Hot Encoding 3
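
The same encoding can be reproduced with PyTorch's built-in function; the sketch below reuses i_string, char_al, character_to_integer, and integer_to_character from the example above:

import torch
import torch.nn.functional as F

indices = torch.tensor([character_to_integer[c] for c in i_string])
onehot = F.one_hot(indices, num_classes=len(char_al))
print(onehot.shape)                                     # torch.Size([27, 27])
print(integer_to_character[onehot[0].argmax().item()])  # 'w', the first character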

Now let's see an example of one-hot encoding with scikit-learn, as follows.

from numpy import array
from numpy import argmax
from sklearn.preprocessing import LabelEncoder
from sklearn.preprocessing import OneHotEncoder

input = ['not', 'yes', 'not', 'not', 'yes', 'yes', 'not', 'not']
value = array(input)
print(value)
# integer encode the labels
l_encoder = LabelEncoder()
i_encoded = l_encoder.fit_transform(value)
print(i_encoded)
# binary (one-hot) encode the integers
o_encoder = OneHotEncoder(sparse=False)  # in scikit-learn >= 1.2, use sparse_output=False
i_encoded = i_encoded.reshape(len(i_encoded), 1)
o_encoded = o_encoder.fit_transform(i_encoded)
print(o_encoded)
# invert the first one-hot row back to its original label
inverted = l_encoder.inverse_transform([argmax(o_encoded[0, :])])
print(inverted)

Explanation

In the above example, we implement one-hot encoding with scikit-learn, as shown. The example uses an integer encoder (LabelEncoder) followed by a binary encoder (OneHotEncoder). The result of running the code is shown in the screenshot below.

Output 4
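
If the scikit-learn output needs to go into a PyTorch model, the NumPy array converts directly to a tensor; a minimal sketch, reusing o_encoded from above:

import torch

t = torch.from_numpy(o_encoded)    # float64 tensor of shape (8, 2)
t = t.to(torch.float32)            # most PyTorch layers expect float32
print(t.shape)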

So, in this way, we can use the one_hot() function as per our requirements.

Conclusion

We hope this article helped you learn more about PyTorch one-hot encoding. From the above article, we have covered the essential idea of PyTorch one-hot encoding, along with its representation and several examples. Furthermore, we learned how and when to use one-hot encoding in PyTorch.

Recommended Articles

This is a guide to PyTorch One Hot Encoding. Here we discuss the overview of PyTorch one-hot encoding and the steps to create it, along with examples and code. You may also have a look at the following articles to learn more –

  1. What is PyTorch?
  2. PyTorch Versions
  3. Python object type
  4. Python string append