PyTorch MNIST

Introduction to PyTorch MNIST

The following article provides an outline for PyTorch MNIST. The Modified National Institute of Standards and Technology database, or MNIST, is a collection of handwritten digit images that is widely used to benchmark image processing systems. The database is helpful in many machine learning and deep learning use cases because the image data can be fetched directly from it without any difficulty. It contains the digits 0 to 9 and provides a baseline against which image processing systems can be tested.

What is PyTorch MNIST?

  • The database is a reliable source for data scientists and is often treated as the "Hello World" of machine learning.
  • Whenever a new architecture or framework is built, data scientists can train it on MNIST to check that the framework is working correctly.
  • MNIST provides a training set of 60,000 examples and a test set of 10,000 examples.
  • All images in the dataset are the same size, with the digits size-normalized and centered.

PyTorch MNIST Model

We download the MNIST dataset and use it to build and train a PyTorch model.

Code:

import torch
from torch import nn
from torch.utils.data import DataLoader
from torchvision import datasets
from torchvision.transforms import ToTensor
import matplotlib.pyplot as plot

# Download the MNIST training and test splits as tensors
train_dataset = datasets.MNIST(
    root='datasets',
    train=True,
    transform=ToTensor(),
    download=True,
)
test_dataset = datasets.MNIST(
    root='datasets',
    train=False,
    transform=ToTensor()
)
print(train_dataset)
print(test_dataset)
print(train_dataset.data.size())
print(test_dataset.data.size())

# Display the first training image with its label as the title
plot.imshow(train_dataset.data[0], cmap='gray')
plot.title('%i' % train_dataset.targets[0])
plot.show()

# Plot a 7 x 7 grid of randomly chosen training samples
figure = plot.figure(figsize=(15, 10))
columns, rows = 7, 7
for p in range(1, columns * rows + 1):
    sample_idx = torch.randint(len(train_dataset), size=(1,)).item()
    img, label = train_dataset[sample_idx]
    figure.add_subplot(rows, columns, p)
    plot.title(label)
    plot.axis("off")
    plot.imshow(img.squeeze(), cmap="gray")
plot.show()

# Two-block convolutional network for 28 x 28 single-channel MNIST images
class CNNetwork(nn.Module):
    def __init__(self):
        super(CNNetwork, self).__init__()
        self.conv1 = nn.Sequential(
            nn.Conv2d(
                in_channels=1,
                out_channels=16,
                kernel_size=5,
                stride=1,
                padding=2,
            ),
            nn.ReLU(),
            nn.MaxPool2d(kernel_size=2),
        )
        self.conv2 = nn.Sequential(
            nn.Conv2d(16, 32, 5, 1, 2),
            nn.ReLU(),
            nn.MaxPool2d(2),
        )
        self.out = nn.Linear(32 * 7 * 7, 10)   # one output per digit class

    def forward(self, a):
        a = self.conv1(a)
        a = self.conv2(a)
        a = a.view(a.size(0), -1)              # flatten for the linear layer
        result = self.out(a)
        return result, a

# Loaders, model, loss function, and optimizer (CrossEntropyLoss and Adam are common choices)
loaders = {
    'train': DataLoader(train_dataset, batch_size=100, shuffle=True),
    'test': DataLoader(test_dataset, batch_size=100, shuffle=True),
}
cnn = CNNetwork()
loss_func = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(cnn.parameters(), lr=0.01)
number_epochs = 8

def train(number_epochs, cnn, loaders):
    cnn.train()
    total_step = len(loaders['train'])
    for epoch in range(number_epochs):
        for p, (images, labels) in enumerate(loaders['train']):
            output = cnn(images)[0]
            loss = loss_func(output, labels)
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
            if (p + 1) % 100 == 0:
                print('Epoch [{}/{}], Step [{}/{}], Loss: {:.4f}'
                      .format(epoch + 1, number_epochs, p + 1, total_step, loss.item()))

train(number_epochs, cnn, loaders)
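
The training function above ends once the epochs finish. As a short follow-up (not in the original article), the trained cnn could be evaluated on the test loader defined above along these lines:

Code:

# Hypothetical evaluation step using the cnn, loaders, and test_dataset defined above
cnn.eval()
correct = 0
with torch.no_grad():                        # gradients are not needed for testing
    for images, labels in loaders['test']:
        predictions = cnn(images)[0].argmax(dim=1)
        correct += (predictions == labels).sum().item()
print('Test accuracy: {:.4f}'.format(correct / len(test_dataset)))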


Prerequisites

  • We must first install the PyTorch binaries. The prebuilt binaries bundle the required libraries, so they are a convenient way to start with PyTorch. On a system with an NVIDIA GPU, a CUDA-enabled build must be installed before importing the PyTorch module; the documentation covers this setup. A short program can confirm that the PyTorch import works and whether the GPU is usable (see the sketch after this list).
  • To start with MNIST, we then set up a Determined cluster; for on-premise deployments it can use multiple GPU servers and is started with a single command. Next, we port the PyTorch model to the MNIST dataset to check that the architecture works well. A built-in training loop feeds batches of data through the forward pass to perform all the calculations.
  • The remaining steps are to initialize the code, build the model, define the optimizer, and define the forward pass. The final step is to load the training dataset and validate the model.
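
As a quick sanity check for the first prerequisite, the sketch below (not part of the original article) confirms that the PyTorch import works and reports whether CUDA and a GPU are usable:

Code:

import torch

print(torch.__version__)             # installed PyTorch version
print(torch.cuda.is_available())     # True only if an NVIDIA GPU and a CUDA build are present

# Place a small tensor on the GPU when available, otherwise fall back to the CPU
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
x = torch.rand(3, 3).to(device)
print(x.device)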

Using PyTorch on MNIST Dataset

  • PyTorch makes it easy to train neural networks on the MNIST dataset. The DataLoader module supplies batches to the network, whose input and hidden layers we can define directly. Activation functions are combined with a loss function and an optimizer to implement the training loop, after which the model can be tested for accuracy.
  • After importing the needed libraries, all the model parameters are defined first. The next step is to load the MNIST dataset into a DataLoader with the chosen batch size. Since the network has hidden layers, we use the ReLU activation function from the PyTorch neural network module, define the feed-forward method, and apply it to the layers.
  • An explicit softmax is not needed here, because cross-entropy loss applies it internally. After setting up the loss and optimizer functions, a training loop is created in which the images are reshaped to the network's input size so the loss can be computed easily. The final testing can then be done, and gradients need not be computed there (see the sketch after this list).
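
The bullets above map onto a short, self-contained workflow. The sketch below is a minimal illustration of that workflow; the layer sizes, learning rate, and number of epochs are illustrative assumptions rather than values from the article:

Code:

import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

# Load MNIST and wrap it in DataLoaders with a common batch size
train_data = datasets.MNIST('datasets', train=True, download=True, transform=transforms.ToTensor())
test_data = datasets.MNIST('datasets', train=False, transform=transforms.ToTensor())
train_loader = DataLoader(train_data, batch_size=100, shuffle=True)
test_loader = DataLoader(test_data, batch_size=100, shuffle=False)

# Feed-forward network with one hidden layer and ReLU; no explicit softmax,
# because nn.CrossEntropyLoss applies log-softmax internally
model = nn.Sequential(nn.Linear(28 * 28, 100), nn.ReLU(), nn.Linear(100, 10))
loss_fn = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

# Training loop: reshape each image batch to the network's input size
for epoch in range(2):
    for images, labels in train_loader:
        images = images.view(-1, 28 * 28)
        loss = loss_fn(model(images), labels)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

# Final testing: gradients are not needed here
correct = 0
with torch.no_grad():
    for images, labels in test_loader:
        predictions = model(images.view(-1, 28 * 28)).argmax(dim=1)
        correct += (predictions == labels).sum().item()
print('Test accuracy:', correct / len(test_data))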

Example of PyTorch MNIST

Given below is the example mentioned:


The first step is to set up the environment by importing torch and torchvision.

Code:

import torch
import torchvision
import matplotlib.pyplot as plot

num_epochs = 5
train_size_batch = 32
test_size_batch = 5000
lr_rate = 0.05
momentum = 0.75
log_intervals = 5
seeds = 2

torch.backends.cudnn.enabled = False
torch.manual_seed(seeds)

# Training and test loaders; the images are converted to tensors and normalized
train_load = torch.utils.data.DataLoader(
    torchvision.datasets.MNIST('/filesaved/', train=True, download=True,
        transform=torchvision.transforms.Compose([
            torchvision.transforms.ToTensor(),
            torchvision.transforms.Normalize(
                (0.1567,), (0.3791,))
        ])),
    batch_size=train_size_batch, shuffle=True)

test_load = torch.utils.data.DataLoader(
    torchvision.datasets.MNIST('/filesave/', train=False, download=True,
        transform=torchvision.transforms.Compose([
            torchvision.transforms.ToTensor(),
            torchvision.transforms.Normalize(
                (0.1567,), (0.3791,))
        ])),
    batch_size=test_size_batch, shuffle=True)

# Take one test batch and display a few samples with their labels
example_dataset = enumerate(test_load)
batch_idx, (example_data, example_results) = next(example_dataset)
print(example_data.shape)

images = plot.figure()
for x in range(4):
    plot.subplot(2, 2, x + 1)
    plot.tight_layout()
    plot.imshow(example_data[x][0], cmap='gray', interpolation='none')
    plot.title("Facts: {}".format(example_results[x]))
    plot.xticks([])
    plot.yticks([])
plot.show()

import torch.nn as netn
import torch.nn.functional as Fun
import torch.optim as optimnet

class Network(netn.Module):
    def __init__(self):
        super(Network, self).__init__()
        self.conv1 = netn.Conv2d(1, 10, kernel_size=5)
        self.conv2 = netn.Conv2d(10, 20, kernel_size=5)
        self.conv2_drop = netn.Dropout2d()
        self.fc1 = netn.Linear(320, 50)       # 20 channels * 4 * 4 after two pooling steps
        self.fc2 = netn.Linear(50, 10)

    def forward(self, a):
        a = Fun.relu(Fun.max_pool2d(self.conv1(a), 2))
        a = Fun.relu(Fun.max_pool2d(self.conv2_drop(self.conv2(a)), 2))
        a = a.view(-1, 320)
        a = Fun.relu(self.fc1(a))
        a = Fun.dropout(a, training=self.training)
        a = self.fc2(a)
        return Fun.log_softmax(a, dim=1)
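
The example stops at the model definition. As a minimal sketch (not part of the original article), the network could be wired to an SGD optimizer with the hyperparameters defined earlier and trained with negative log-likelihood loss, which pairs with the log_softmax output; the exact choices here are assumptions:

Code:

# Hypothetical training step reusing num_epochs, lr_rate, momentum, log_intervals, and train_load from above
network = Network()
optimizer = optimnet.SGD(network.parameters(), lr=lr_rate, momentum=momentum)

network.train()
for epoch in range(num_epochs):
    for batch_idx, (data, target) in enumerate(train_load):
        optimizer.zero_grad()
        output = network(data)
        loss = Fun.nll_loss(output, target)   # matches the log_softmax output
        loss.backward()
        optimizer.step()
        if batch_idx % log_intervals == 0:
            print('Epoch {} [{}/{}]  Loss: {:.4f}'.format(
                epoch, batch_idx * len(data), len(train_load.dataset), loss.item()))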

Conclusion

We can use MNIST in supervised learning, where classifiers are trained on labelled examples. The dataset helps data scientists explore different modes of training and gives a broad description of the data being used, which is why a labelled dataset such as MNIST is preferred in these cases.

Recommended Articles

This is a guide to PyTorch MNIST. Here we discuss the introduction, the PyTorch MNIST model, prerequisites, and an example. You may also have a look at the following articles to learn more –

  1. PyTorch Versions
  2. torch.nn Module
  3. Tensorflow Basics
  4. Introduction to Tensorflow