PyTorch Early Stopping

Introduction to PyTorch Early Stopping

Early stopping is a regularization technique that helps avoid overfitting the model on the data used for training. In this article, we will take a detailed dive into PyTorch early stopping: an overview, how to use it, how to implement it, and examples, before concluding.

PyTorch Early Stopping Overview

PyTorchtool.py is the library whose EarlyStopping class helps create an object that keeps track of the losses incurred during validation. Training is stopped completely when the decrease in validation loss stalls for multiple epochs in a row. Whenever the validation loss decreases, the PyTorch model saves a new checkpoint.

An argument called patience is set on the EarlyStopping class to specify the number of epochs to wait after the last improvement in validation loss; once that many epochs pass without improvement, the training loop is broken.
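
As a rough illustration, the snippet below sketches how such a helper can drive a training loop, assuming a pytorchtools-style EarlyStopping class as described above; train_one_epoch, validate, model, epochs, and the loaders are hypothetical placeholders for your own objects –

import torch
from pytorchtools import EarlyStopping

early_stopping = EarlyStopping(patience=5, path='checkpoint.pt')
for epoch in range(1, epochs + 1):
    train_one_epoch(model, loaderForTraining)          # placeholder training step
    validationLoss = validate(model, loaderForValidation)  # placeholder validation step
    # Saves a checkpoint whenever validationLoss improves,
    # otherwise counts the epoch against patience
    early_stopping(validationLoss, model)
    if early_stopping.early_stop:
        print('Early stopping triggered at epoch', epoch)
        break
# Restore the best weights recorded by the helper
model.load_state_dict(torch.load('checkpoint.pt'))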

How to Use PyTorch Early Stopping?

We can stop a particular epoch early by overriding the hook named on_train_batch_start() available in PyTorch Lightning. This function should return -1 only when the specified condition is fulfilled. If -1 is returned on the first batch of every epoch, each requested epoch ends immediately, so the complete run stops early.
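
A minimal sketch of this idea, assuming PyTorch Lightning (where this hook lives); loss_has_diverged() is a hypothetical condition standing in for whatever criterion we monitor –

import pytorch_lightning as pl

class LitModel(pl.LightningModule):
    def on_train_batch_start(self, batch, batch_idx):
        # Hypothetical condition; replace with the criterion being monitored
        if self.loss_has_diverged():
            return -1  # returning -1 ends the current epoch immediately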

An alternative way is to use the EarlyStopping callback, which monitors a validation metric and stops the training when no improvement is observed.

Implementing PyTorch Early Stopping

To enable the EarlyStopping callback, we have to perform the following steps (a minimal sketch follows the list) –

  • Import the EarlyStopping callback at the top of the program.
  • Log the metric we want to monitor by using the log() method.
  • Initialize the callback and set monitor to whichever logged metric we choose.
  • Pass the EarlyStopping callback to the Trainer's callbacks flag.
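
The snippet below is a minimal sketch of these four steps, assuming PyTorch Lightning; LitClassifier, compute_loss(), and the metric name val_loss are illustrative names –

import pytorch_lightning as pl
# Step 1: import the EarlyStopping callback
from pytorch_lightning.callbacks import EarlyStopping

class LitClassifier(pl.LightningModule):
    def validation_step(self, batch, batch_idx):
        loss = self.compute_loss(batch)  # hypothetical loss computation
        # Step 2: log the metric that should be monitored
        self.log('val_loss', loss)
        return loss

# Step 3: initialize the callback and set monitor to the logged metric
early_stopping = EarlyStopping(monitor='val_loss', mode='min', patience=3)
# Step 4: pass the callback to the Trainer's callbacks flag
trainer = pl.Trainer(callbacks=[early_stopping])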

Some other parameters that are useful for stopping the training at extreme points are as follows (see the sketch after this list) –

  • Divergence threshold – If the monitored quantity reaches a value even worse than this threshold, training is stopped immediately. When such a condition arises and we are sure the model cannot recover, it is better to stop early by specifying this condition at the beginning.
  • Stopping threshold – When the monitored quantity reaches this threshold value, training is stopped immediately. The most important scenario is when we already know that once the value passes a particular point, there is no further benefit to us.
  • Check finite – When this parameter is turned on, training stops if the value of the monitored metric becomes infinite or NaN.
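
A hedged sketch of configuring these parameters on the Lightning EarlyStopping callback; the threshold values here are arbitrary examples, not recommendations –

from pytorch_lightning.callbacks import EarlyStopping

early_stopping = EarlyStopping(
    monitor='val_loss',
    mode='min',
    stopping_threshold=0.05,   # stop as soon as val_loss is already good enough
    divergence_threshold=5.0,  # stop as soon as val_loss is hopelessly bad
    check_finite=True          # stop if val_loss becomes NaN or infinite
)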

PyTorch Early Stopping Examples

Let us consider a sample example where we write a program for the recognition of handwritten digits from the simple MNIST dataset –

# UTF-8 standard coding pattern
import torch
import torch.nn as neuralNetwork
import torch.optim as optim
import torch.utils.data as sampleData
from torchvision import datasets as educbaSetOfData, transforms

# Designing the architecture of the model
class Net(neuralNetwork.Module):
    def __init__(selfParam):
        super(Net, selfParam).__init__()
        selfParam.main = neuralNetwork.Sequential(
            neuralNetwork.Linear(in_features=784, out_features=128),
            neuralNetwork.ReLU(),
            neuralNetwork.Linear(in_features=128, out_features=64),
            neuralNetwork.ReLU(),
            neuralNetwork.Linear(in_features=64, out_features=10),
            neuralNetwork.LogSoftmax(dim=1)
        )

    def forward(selfParam, input):
        return selfParam.main(input)

# Training the model
def train(device, model, epochs, optimizer, functionForLossCalculation, loaderForTraining):
    for epoch in range(1, epochs + 1):
        for times, batch in enumerate(loaderForTraining, 1):
            inputs = batch[0].to(device)
            labels = batch[1].to(device)
            # Set the gradients to zero
            optimizer.zero_grad()
            # Forward propagation, loss computation, and backward propagation
            outputs = model(inputs.view(inputs.shape[0], -1))
            loss = functionForLossCalculation(outputs, labels)
            loss.backward()
            optimizer.step()
            # Displaying the progress of the model
            if times % 100 == 0 or times == len(loaderForTraining):
                print('[{}/{}, {}/{}] loss: {:.8}'.format(epoch, epochs, times, len(loaderForTraining), loss.item()))
    return model

def test(device, model, loaderUsedForTestingPurpose):
    # Put the model in evaluation mode
    model.eval()
    acquiredTotal = 0
    correctnessOfModel = 0
    with torch.no_grad():
        for batch in loaderUsedForTestingPurpose:
            inputs = batch[0].to(device)
            labels = batch[1].to(device)
            outputs = model(inputs.view(inputs.shape[0], -1))
            _, predicted = torch.max(outputs.data, 1)
            acquiredTotal += labels.size(0)
            correctnessOfModel += (predicted == labels).sum().item()
    print('Achieved Accuracy:', correctnessOfModel / acquiredTotal)

def main():
    # Select the GPU device if one is available
    device = 'cuda:0' if torch.cuda.is_available() else 'cpu'
    print('State Of Device:', device)
    # Settings of the model
    epochs = 100
    sizeOfBatch = 64
    lr = 0.002
    functionForLossCalculation = neuralNetwork.NLLLoss()
    model = Net().to(device)
    optimizer = optim.Adam(model.parameters(), lr=lr)
    # Transformations applied to the dataset
    sampleObjectOftransform = transforms.Compose(
        [transforms.ToTensor(),
         transforms.Normalize((0.5,), (0.5,))])
    # Prepare the datasets for training and testing
    setOfValuesForTraining = educbaSetOfData.MNIST(root='MNIST', download=True, train=True, transform=sampleObjectOftransform)
    setForTesting = educbaSetOfData.MNIST(root='MNIST', download=True, train=False, transform=sampleObjectOftransform)
    loaderForTraining = sampleData.DataLoader(setOfValuesForTraining, batch_size=sizeOfBatch, shuffle=True)
    loaderUsedForTestingPurpose = sampleData.DataLoader(setForTesting, batch_size=sizeOfBatch, shuffle=False)
    # Training of the model
    sampleEducbaModel = train(device, model, epochs, optimizer, functionForLossCalculation, loaderForTraining)
    # Testing the working of the model
    test(device, sampleEducbaModel, loaderUsedForTestingPurpose)

if __name__ == '__main__':
    main()

The execution of the above code prints the state of the device, the per-batch training loss, and finally the accuracy achieved on the test set.

Conclusion

While training a neural network model, we can avoid overfitting by using early stopping: we observe the plots of validation and training loss and, based on how the two values diverge, decide whether to terminate the training.

Recommended Articles

This is a guide to PyTorch Early Stopping. Here we discuss the introduction, an overview, how to use PyTorch early stopping, and examples with code implementation. You may also have a look at the following articles to learn more –

  1. Dataset Pytorch
  2. PyTorch Conv2d
  3. What is PyTorch?
  4. PyTorch vs Keras