PyTorch permute

Updated April 5, 2023

Introduction to PyTorch Permute

PyTorch provides many different kinds of functionality to the user, and permute is one of them. In deep learning, we sometimes need to rearrange the dimensions of a tensor into a specified order and get back a new multidimensional tensor; the permute() function does exactly that. An important property of permute() is that the returned tensor contains the same elements as the original tensor; only the ordering of its dimensions changes. Because permute() returns a view of the original data rather than copying it, the call itself is cheap, which helps keep deep learning code efficient.

What is PyTorch Permute?

The transpose/permute operation appears throughout Transformer models, which dominate NLP, and in the Vision Transformer, a rising star in computer vision. In multi-head attention in particular, this operation is used to rearrange the dimension order of the data.
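As a brief, hedged illustration (the tensor sizes and the variable names below are assumptions made for this sketch, not taken from any particular model), a multi-head attention block typically uses permute to move the head dimension next to the batch dimension:

import torch

batch, seq_len, num_heads, head_dim = 2, 10, 4, 16   # illustrative sizes

# Projected activations laid out as (batch, seq, heads, head_dim)
x = torch.randn(batch, seq_len, num_heads, head_dim)

# Rearrange to (batch, heads, seq, head_dim) so attention can be
# computed independently per head.
x = x.permute(0, 2, 1, 3)
print(x.shape)    # torch.Size([2, 4, 10, 16])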

Clearly, as a heavily used operation, the CUDA implementation of the transpose/permute kernel affects the training speed of the overall network. Benchmark results show that a highly optimized permute kernel can be much faster and more bandwidth-efficient than PyTorch's stock implementation, with bandwidth utilization close to that of the native copy operation.
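Note that in PyTorch itself, permute() only returns a strided view; the actual data movement whose bandwidth is discussed above happens when the permuted tensor is made contiguous. A minimal sketch (the shapes below are arbitrary):

import torch

x = torch.randn(64, 128, 32)
y = x.permute(2, 0, 1)                 # no data is moved; y is a strided view of x
print(y.is_contiguous())               # False
print(y.data_ptr() == x.data_ptr())    # True: same underlying storage

z = y.contiguous()                     # here the permute/copy kernel actually runs
print(z.is_contiguous())               # True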

How to use PyTorch permute?

Now let's see how we can use the permute() function in PyTorch.

Having discussed what the permute() function does, let's look at how to call it.

Syntax

torch.permute(specified input, specified dimension)

Explanation

In the above syntax, the permute() function takes two parameters, described below; a short sketch after the list shows them in use.

  • Specified input: the input tensor. In the examples below, we create it with the randn() function.
  • Specified dimension: the desired ordering of the tensor's dimensions, chosen according to the user's requirement.
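A small sketch of how the two parameters fit together (the tensor shape below is just an illustrative choice); note that the specified dimension must mention every dimension of the input exactly once, and the result is a view of the same data:

import torch

x = torch.randn(2, 3, 4)            # specified input

y = torch.permute(x, (2, 0, 1))     # specified dimension: new order of dims
print(y.shape)                      # torch.Size([4, 2, 3])

# The dims argument must be a permutation of (0, 1, ..., x.dim() - 1);
# passing something like (0, 1) or (0, 0, 1) raises a RuntimeError.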

PyTorch Permute Elements

Now let's look at the different elements (arguments) involved when permutation is used for feature attribution, as follows.

Inputs: Input for which permutation attributions are computed. If forward_func takes a single tensor as input, a single input tensor should be provided. If forward_func takes multiple tensors as input, a tuple of the input tensors should be provided. It is assumed that for all given input tensors, dimension 0 corresponds to the number of examples (also known as batch size), and if multiple input tensors are provided, the examples must be aligned appropriately.

Target: Output indices for which the difference is computed (for classification cases, this is usually the target class). If the network returns a scalar value for each example, no target index is necessary. For general 2D outputs, targets can be either:

  • A single integer, or a tensor containing a single integer, which is applied to all input examples.
  • A list of integers or a 1D tensor, with length matching the number of examples in inputs (dim 0). Each integer is applied as the target for the corresponding example.

For outputs with more than two dimensions, targets can be either:

  • A single tuple containing output_dims – 1 elements. This target index is applied to all examples.
  • A list of tuples with length equal to the number of examples in inputs (dim 0), where each tuple contains output_dims – 1 elements. Each tuple is applied as the target for the corresponding example.

additional_forward_args: This argument can be provided if the forward function requires additional arguments beyond the inputs, for which attributions should not be computed. It must be either a single additional argument of tensor or arbitrary (non-tuple) type, or a tuple containing multiple additional arguments, including tensors or any arbitrary Python types. These arguments are passed to forward_func in order, after the arguments in inputs. For a tensor, the first dimension must correspond to the number of examples; arguments of any other type are passed unchanged to all forward evaluations.
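To make the roles of inputs, target, and additional_forward_args concrete, here is a hedged sketch of a permutation-based attribution loop written in plain PyTorch; the helper permutation_importance and the toy linear model are hypothetical, written only to illustrate how the three elements are typically used:

import torch

def permutation_importance(forward_func, inputs, target, additional_forward_args=()):
    """Hypothetical sketch: score each feature by how much the target output
    drops when that feature is shuffled across the batch dimension (dim 0)."""
    base = forward_func(inputs, *additional_forward_args)[:, target]   # baseline scores
    scores = torch.zeros(inputs.shape[1])
    for f in range(inputs.shape[1]):
        permuted = inputs.clone()
        # Shuffle feature f across the batch dimension.
        permuted[:, f] = permuted[torch.randperm(inputs.shape[0]), f]
        out = forward_func(permuted, *additional_forward_args)[:, target]
        scores[f] = (base - out).mean()        # average drop caused by shuffling
    return scores

# Toy usage: a linear model stands in for forward_func.
model = torch.nn.Linear(4, 3)
x = torch.randn(8, 4)                          # batch of 8 examples, 4 features
print(permutation_importance(model, x, target=1))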

PyTorch permute method

Different methods are mentioned below:

  • Naive Permute Implementation: The purpose of permute is to change the order of a tensor's data dimensions; a naive implementation simply computes, for every output element, the corresponding input offset.
  • Static Dispatch of IndexType: As deep learning models get larger, the number of elements involved in the operation may exceed the range that int32_t can represent. Moreover, the division operations used for index calculation have different overheads for different integer types, so the index type is dispatched statically.
  • Merging Redundant Dimensions: In some special cases, the permuted dimensions can be merged according to the following rules (see the sketch after this list):
    • Dimensions of size 1 can be removed directly.
    • Consecutive dimensions that remain adjacent, in the same order, after the permutation can be merged into a single dimension.
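As a hedged sketch of these merging rules, the helper simplify_permute below is hypothetical (written only for illustration; it is not part of PyTorch) and expresses the two rules in plain Python:

def simplify_permute(shape, dims):
    """Hypothetical helper: simplify a permutation by dropping size-1
    dimensions and merging source dimensions that stay consecutive
    after permuting."""
    # Rule 1: dimensions of size 1 carry no data movement and can be removed.
    keep = [d for d in range(len(shape)) if shape[d] != 1]
    remap = {d: i for i, d in enumerate(keep)}
    shape = [shape[d] for d in keep]
    dims = [remap[d] for d in dims if d in remap]

    # Rule 2: group the permuted order into maximal runs of consecutive
    # source dimensions; each run behaves like one bigger dimension.
    runs = []                                   # (first source dim, merged size)
    i = 0
    while i < len(dims):
        start, size, j = dims[i], shape[dims[i]], i + 1
        while j < len(dims) and dims[j] == dims[j - 1] + 1:
            size *= shape[dims[j]]
            j += 1
        runs.append((start, size))
        i = j

    # Renumber the merged dimensions in source order to obtain an
    # equivalent, smaller (shape, dims) pair.
    order = sorted(range(len(runs)), key=lambda r: runs[r][0])
    new_dim = {r: k for k, r in enumerate(order)}
    new_shape = [runs[r][1] for r in order]
    new_dims = [new_dim[r] for r in range(len(runs))]
    return new_shape, new_dims

# A 4D permute that swaps two blocks of adjacent dimensions collapses
# into a plain 2D transpose.
print(simplify_permute([2, 3, 4, 5], [2, 3, 0, 1]))   # ([6, 20], [1, 0])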

Using Greater Access Granularity

You may have noticed a template parameter, size_t movement_size, in the kernel function; it indicates the granularity at which elements are accessed. The NVIDIA performance-optimization blog post "Increase Performance with Vectorized Memory Access" notes that CUDA kernel performance can be improved by vectorizing memory operations, which reduces the number of instructions and improves bandwidth utilization.

Now let's look at a few examples of the permute() function for better understanding.

Example #1

Code:

import torch

# Create a random tensor of shape (3, 5, 2)
A = torch.randn(3, 5, 2)
print(A.size())     # torch.Size([3, 5, 2])

# Reorder the dimensions as (dim 1, dim 0, dim 2)
output = torch.permute(A, (1, 0, 2)).size()
print(output)       # torch.Size([5, 3, 2])

Explanation

In the above example, we create a tensor with the randn() function and then apply the permute() function as shown. The final output of the implementation is shown in the screenshot below.

[Screenshot: PyTorch permute output 1]

Example #2

Now let's look at another example of the permute() function.

Code:

import torch

# Create a random 2 x 3 tensor
input_tensor = torch.randn(2, 3)
print(input_tensor.size())    # torch.Size([2, 3])
print(input_tensor)

# Swap the two dimensions; the result is a 3 x 2 view of the same data
input_tensor = input_tensor.permute(1, 0)
print(input_tensor.size())    # torch.Size([3, 2])
print(input_tensor)

Explanation

The final output of the above implementation is shown in the screenshot below.

[Screenshot: PyTorch permute output 2]

Conclusion

We hope this article helped you learn more about PyTorch permute. We covered the essential idea behind the permute() function, looked at its syntax and examples, and discussed how and when to use it.
