PyTorch Lightning

Introduction to PyTorch Lightning

PyTorch Lightning is a free Python library that provides a high-level interface to the deep learning framework PyTorch. The code is organized so that different experiments can be created and restructured with various inputs. Furthermore, scalable deep learning models can be built easily with this library, and these models stay decoupled from the hardware they run on. It was initially released in May 2019 and can be used on multiple platforms. William Falcon is the original author, but many developers have contributed since, so the credit cannot be given to one person alone.

What is PyTorch Lightning?

PyTorch Lightning is an AI research tool mostly preferred for its high performance: it abstracts away the deep learning boilerplate so that we keep control over the research code we write in Python. Lightning helps to scale models, and code can be extended to meet our requirements without the boilerplate growing with it. The LightningModule gives the research code a clear structure through a small set of well-defined methods.

Code in a LightningModule is organized into five sections: computations, the train loop, the validation loop, the test loop, and optimizers. First, __init__ defines the computations, and forward defines how data flows through the model at inference time. training_step implements the training loop for one batch, and validation_step implements the validation loop. Similarly, test_step implements the test loop, and configure_optimizers declares the module's optimizers and learning-rate schedulers.
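
A minimal skeleton illustrating these five sections (a sketch, not a complete model):

import torch
from torch import nn
import pytorch_lightning as pl

class LitModel(pl.LightningModule):
    def __init__(self):
        # 1. Computations: define layers and other state here
        super().__init__()
        self.layer = nn.Linear(28 * 28, 10)

    def forward(self, x):
        # Inference-time forward pass
        return self.layer(x.view(x.size(0), -1))

    def training_step(self, batch, batch_idx):
        # 2. Train loop: compute and return the loss for one batch
        x, y = batch
        return nn.functional.cross_entropy(self(x), y)

    def validation_step(self, batch, batch_idx):
        # 3. Validation loop
        x, y = batch
        self.log("val_loss", nn.functional.cross_entropy(self(x), y))

    def test_step(self, batch, batch_idx):
        # 4. Test loop
        x, y = batch
        self.log("test_loss", nn.functional.cross_entropy(self(x), y))

    def configure_optimizers(self):
        # 5. Optimizers (and, optionally, learning-rate schedulers)
        return torch.optim.Adam(self.parameters(), lr=1e-3)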

Typical Project

Lightning Transformers provides an interface for training SOTA transformer models. We can use Lightning callbacks, accelerators, or loggers that help achieve better training performance. Speed optimizations such as DeepSpeed ZeRO and FairScale Sharded Training can be used to reduce memory usage and improve throughput. We can swap models and add configurations for optimizers and schedulers using Hydra, a configuration composition framework. There is also an option to build from scratch with the help of transformer task abstractions, which support research and experimentation.


These tasks help to train transformer models on the provided datasets, or we can use Hydra to swap models. The Lightning module has several options such as callbacks, accelerators, and scaling, among other features that help manage the code based on requirements and customizations.
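
As an illustration of these options, here is a minimal Trainer configuration with a callback, a logger, and an accelerator (a sketch; the flag names follow recent pytorch_lightning releases and may differ in other versions):

import pytorch_lightning as pl
from pytorch_lightning.callbacks import EarlyStopping
from pytorch_lightning.loggers import TensorBoardLogger

# Stop training when the monitored validation loss stops improving,
# and log metrics to TensorBoard.
trainer = pl.Trainer(
    max_epochs=10,
    accelerator="gpu",  # or "cpu"; "auto" picks whatever is available
    devices=1,
    callbacks=[EarlyStopping(monitor="val_loss")],
    logger=TensorBoardLogger("logs/"),
)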

Here we take a Lightning Transformers project as the example.

The first step is to install the module.

pip install lightning-transformers

Alternatively, we can install from source:

git clone https://github.com/PyTorchLightning/lightning-transformers.git
cd lightning-transformers
pip install .

PyTorch Lightning – Model

We can design multi-layered neural networks using PyTorch Lightning.


import torch
from torch import nn
from torch.nn import functional as Fun
from pytorch_lightning import LightningModule

class LitMNIST(LightningModule):
    def __init__(self):
        super().__init__()
        # MNIST images are 28 x 28 = 784 pixels
        self.layer_1 = nn.Linear(28 * 28, 144)
        self.layer_2 = nn.Linear(144, 288)
        self.layer_3 = nn.Linear(288, 10)

    def forward(self, x):
        batch_size, channels, height, width = x.size()
        x = x.view(batch_size, -1)  # flatten each image into a vector
        x = self.layer_1(x)
        x = Fun.relu(x)
        x = self.layer_2(x)
        x = Fun.relu(x)
        x = self.layer_3(x)
        x = Fun.log_softmax(x, dim=1)  # log-probabilities over the 10 classes
        return x
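
To sanity-check the model, we can instantiate it and pass a dummy MNIST-shaped batch through it:

net = LitMNIST()
x = torch.randn(1, 1, 28, 28)  # one fake MNIST image: batch, channels, height, width
out = net(x)
print(out.shape)  # torch.Size([1, 10]) -- log-probabilities over the 10 digit classes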

We can use a LightningModule just like a PyTorch module and make the necessary changes. Here we are using the MNIST dataset.

class LitMNIST(LightningModule):
    def training_step(self, batch, batch_idx):
        x, y = batch
        logits = self(x)
        # nll_loss pairs with the log_softmax output of forward
        loss = Fun.nll_loss(logits, y)
        return loss
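
To actually train the module, Lightning also needs the configure_optimizers method described earlier, which is not shown above for LitMNIST; a minimal sketch using Adam:

class LitMNIST(LightningModule):
    # __init__, forward, and training_step as above
    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=1e-3)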

PyTorch Lightning – Data

Lightning modules need a DataLoader to feed them data. The following code builds one for the MNIST dataset.

import os
from torch.utils.data import DataLoader, random_split
from torchvision.datasets import MNIST
from torchvision import transforms

# Normalize with the standard MNIST mean and standard deviation
transform = transforms.Compose([transforms.ToTensor(), transforms.Normalize((0.1307,), (0.3081,))])
mnist_train = MNIST(os.getcwd(), train=True, download=True, transform=transform)
mnist_train = DataLoader(mnist_train, batch_size=64)
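
To confirm the loader works, we can pull one batch and inspect its shape:

images, labels = next(iter(mnist_train))
print(images.shape)  # torch.Size([64, 1, 28, 28])
print(labels.shape)  # torch.Size([64])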

DataLoaders can be used in different ways with the Lightning module. For example, a DataLoader can be passed directly to the Trainer's fit function.

from pytorch_lightning import Trainer

model = LitMNIST()
trainer = Trainer()
trainer.fit(model, mnist_train)
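
fit also accepts a validation DataLoader alongside the training one; a quick sketch, assuming a hypothetical mnist_val loader built the same way as mnist_train:

trainer.fit(model, mnist_train, mnist_val)  # training and validation loaders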

We can also define the DataLoaders inside the LightningModule itself, which keeps the data handling next to the model in research code.

import pytorch_lightning as pl

class LitMNIST(pl.LightningModule):
    def train_dataloader(self):
        transform = transforms.Compose([transforms.ToTensor(), transforms.Normalize((0.1307,), (0.3081,))])
        mnist_train = MNIST(os.getcwd(), train=True, download=True, transform=transform)
        return DataLoader(mnist_train, batch_size=64)

    def val_dataloader(self):
        transforms = ...
        mnist_val = ...
        return DataLoader(mnist_val, batch_size=64)

    def test_dataloader(self):
        transforms = ...
        mnist_test = ...
        return DataLoader(mnist_test, batch_size=64)

A LightningDataModule bundles datasets and their DataLoaders together so they can be reused without any additional wiring.

from typing import Optional
from pytorch_lightning import LightningDataModule

class MyDataModule(LightningDataModule):
    def __init__(self):
        super().__init__()
        self.train_dims = None
        self.vocab_size = 0

    def prepare_data(self):
        # download_dataset, tokenize, and build_vocab are placeholder
        # helpers; prepare_data is called once, on a single process
        download_dataset()
        tokenize()
        build_vocab()

    def setup(self, stage: Optional[str] = None):
        # setup is called on every process; load_vocab and
        # load_datasets are placeholders as well
        vocab = load_vocab()
        self.vocab_size = len(vocab)
        self.train, self.val, self.test = load_datasets()
        self.train_dims = self.train.next_batch.size()

    def train_dataloader(self):
        transforms = ...
        return DataLoader(self.train, batch_size=64)

    def val_dataloader(self):
        transforms = ...
        return DataLoader(self.val, batch_size=64)

    def test_dataloader(self):
        transforms = ...
        return DataLoader(self.test, batch_size=64)

Dataset details, such as the number of classes, can then be fetched from the data module instead of being hard-coded in the model.

# MNISTDatamodule and ImagenetDatamodule stand in for user-defined data modules
mnist_dm = MNISTDatamodule()
model = LitModel(num_classes=mnist_dm.num_classes)
trainer.fit(model, mnist_dm)

imagenet_dm = ImagenetDatamodule()
model = LitModel(num_classes=imagenet_dm.num_classes)
trainer.fit(model, imagenet_dm)

PyTorch Lightning – Examples

First, we must install PyTorch Lightning and define the model so that it knows about the dataset used in the code. Then we add the training details, scheduler, and optimizer to the model. Finally, we can load the data using the following code.

import pytorch_lightning as pl
from torch import nn
from torch.optim import SGD
from torchvision import datasets, transforms
from torch.utils.data import DataLoader

class Data(pl.LightningDataModule):
    def prepare_data(self):
        transform = transforms.Compose([transforms.ToTensor()])
        self.train_data = datasets.MNIST('', train=True, download=True, transform=transform)
        self.test_data = datasets.MNIST('', train=False, download=True, transform=transform)

    def train_dataloader(self):
        return DataLoader(self.train_data, batch_size=32, shuffle=True)

    def val_dataloader(self):
        # validation data should not be shuffled
        return DataLoader(self.test_data, batch_size=32)

class model(pl.LightningModule):
    def __init__(self):
        super(model, self).__init__()
        self.fc1 = nn.Linear(28 * 28, 256)
        self.fc2 = nn.Linear(256, 128)
        self.out = nn.Linear(128, 10)
        self.lr = 0.01
        self.loss = nn.CrossEntropyLoss()

    def forward(self, x):
        # flatten, then pass through the two hidden layers to the output layer
        x = x.view(x.size(0), -1)
        x = nn.functional.relu(self.fc1(x))
        x = nn.functional.relu(self.fc2(x))
        return self.out(x)

    def configure_optimizers(self):
        return SGD(self.parameters(), lr=self.lr)

    def training_step(self, train_batch, batch_idx):
        x, y = train_batch
        logits = self.forward(x)
        loss = self.loss(logits, y)
        return loss
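
Putting the two classes together, a minimal sketch of a training run:

data = Data()
net = model()
trainer = pl.Trainer(max_epochs=1)
trainer.fit(net, datamodule=data)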

Conclusion

The Lightning module is easy to use because readability is high: it hides the engineering code and uses only familiar Python modules. Moreover, code changes are easy to track, so reproducibility is straightforward in PyTorch Lightning. Lightning also runs code on GPUs, CPUs, and clusters without any additional management.

Recommended Articles

This is a guide to PyTorch Lightning. Here we discuss what PyTorch Lightning is, along with a typical project and examples. You may also have a look at the following articles to learn more –

  1. Python string append
  2. Python Raw String
  3. Python Widgets
  4. CGI in Python