Backward Elimination

By Priya Pedamkar

Introduction to Backward Elimination

As humans and machines move toward digital evolution, machines must be not only trained but trained smartly to come out with better recognition of real-world objects. One technique introduced for this purpose is "Backward Elimination", which favors the indispensable features while eliminating the irrelevant ones so that the machine can be better optimized. How proficiently a machine recognizes objects is proportionate to the features it considers.

Features that have no influence on the predicted output must be removed from the model, and backward elimination identifies them. Both the precision and the time complexity with which a machine recognizes any real-world object depend on its learning, so backward elimination plays a rigorous role in feature selection. It measures the rate of dependency between each feature and the dependent variable to find how significantly the feature belongs in the model. To decide, it compares that measured rate (the feature's p-value) with a chosen significance level (say 0.05) and keeps or rejects the feature accordingly.

Why Do We Need Backward Elimination?

Unessential and redundant features drive up the complexity of a model's logic and consume time and resources unnecessarily. Backward elimination therefore plays a competent role in keeping the model simple: the algorithm cultivates the best version of the model by optimizing its performance and trimming the resources it would otherwise waste.

It removes the least noteworthy features from the model, the ones that add noise when the regression line is being decided. Irrelevant traits can lead to misclassification and poor prediction, and they can unbalance the model with respect to the genuinely significant features of other objects. Backward elimination fosters the best possible fitting of the model, which is why it is recommended.

How to Apply Backward Elimination?

Backward elimination commences with all the feature variables, testing them against the dependent variable under a selected model-fitting criterion. It then removes, one at a time, the variables that deteriorate the fit of the regression line, repeating the deletion until the model attains a good fit. Below are the steps to practice backward elimination:

Step 1: Choose the significance level a feature must meet to reside in the model (take SL = 0.05).

Step 2: Fit the model with all the available independent variables against the dependent variable, and compute the slope and intercept to draw the line of regression (fitting line).

Step 3: Consider the independent variable whose coefficient has the highest p-value (call it P) and proceed as follows:

a) If P > SL, execute Step 4.
b) Else stop; the model is ready.

Step 4: Remove that chosen variable from the model.

Step 5: Re-fit the model and compute the slope and intercept of the fitting line again with the remaining variables, then return to Step 3.

The aforementioned steps amount to rejecting every feature whose significance rate (p-value) is above the selected significance level (0.05), which evades an over-complicated model and overutilization of resources.
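The five steps above can be sketched in Python. This is a minimal illustration, not code from the article: the OLS fit, the p-value computation, and the synthetic dataset are all assumptions added here, with NumPy and SciPy standing in for a statistics package.

```python
import numpy as np
from scipy import stats

def backward_eliminate(X, y, sl=0.05):
    """Drop, one at a time, the predictor whose coefficient has the
    highest p-value, for as long as that p-value exceeds sl.
    X: (n, p) array of predictors (no intercept column); y: (n,) target.
    Returns the column indices of the retained predictors."""
    n = len(y)
    keep = list(range(X.shape[1]))
    while keep:
        # Steps 2/5: fit ordinary least squares with an intercept.
        A = np.column_stack([np.ones(n), X[:, keep]])
        beta, *_ = np.linalg.lstsq(A, y, rcond=None)
        resid = y - A @ beta
        dof = n - A.shape[1]
        sigma2 = resid @ resid / dof
        se = np.sqrt(np.diag(sigma2 * np.linalg.inv(A.T @ A)))
        # Step 3: two-sided p-values for each slope (skip the intercept).
        pvals = 2 * stats.t.sf(np.abs(beta / se), dof)[1:]
        worst = int(np.argmax(pvals))
        if pvals[worst] <= sl:
            break                 # every remaining predictor is significant
        del keep[worst]           # Step 4: remove it, then refit
    return keep

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))               # column 0 carries the signal
y = 3.0 * X[:, 0] + rng.normal(size=200)    # columns 1 and 2 are noise
selected = backward_eliminate(X, y)
```

With a strong signal in column 0 and pure noise elsewhere, the loop keeps column 0 and usually discards the rest, mirroring Steps 3 through 5.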

Merits and Demerits of Backward Elimination

Here are the merits and demerits of backward elimination, given below in detail:

1. Merits

Merits of backward elimination are as follows:

  • Speedy Training: The machine is trained on a set of available pattern features, and training completes in a very short time when the unessential features are removed from the model. Speedy training of a dataset comes into the picture only when the model deals with the significant features and excludes all the noise variables, which keeps training complexity low. The model should not, however, undergo under-fitting, which occurs due to a lack of features or inadequate samples; the retained features should still be plentiful enough for the best classification. The aim is for training to take less time while classification accuracy is maintained and no predictive variable is left out.
  • Lower Complexity: The complexity of the model becomes high when it contemplates the full extent of features, including noise and unrelated ones. The model consumes much space and time to process such a span of features. Retaining them may appear to increase the accuracy of pattern recognition, but that accuracy may reflect noise as well. To get rid of such high complexity, the backward elimination algorithm plays a requisite role by retrenching the unwanted features from the model, simplifying its processing logic. A few essential features are ample to draw a good fit with reasonable accuracy.
  • Improved Performance: Model performance depends on many aspects, and backward elimination is one way to optimize it. Optimizing the model amounts to optimizing the dataset used for training it, and the model's performance is directly proportional to its rate of optimization, which relies on how much of the data is significant. Backward elimination does not begin its alterations with just any predictor: it removes the least significant predictor first (the one with the highest p-value), because that is where most of the model's unnecessary complexity comes from.
  • Circumvents Over-Fitting: Over-fitting occurs when the model is given too many predictors, so that during classification or prediction some predictors pick up the noise of other classes. In this situation the model appears to give unexpectedly high accuracy, yet it may fail to classify new data because of the confusion created in its logic by too many conditions. The backward elimination technique curtails the extraneous features to circumvent over-fitting.
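The over-fitting point can be seen numerically: plain R² never decreases when extra columns are added, while adjusted R² penalizes irrelevant ones. The helper function and the synthetic data below are hypothetical illustrations, not part of the article.

```python
import numpy as np

def r2_scores(X, y):
    """Fit OLS with an intercept; return (R^2, adjusted R^2)."""
    n, p = X.shape
    A = np.column_stack([np.ones(n), X])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    ss_res = ((y - A @ beta) ** 2).sum()
    ss_tot = ((y - y.mean()) ** 2).sum()
    r2 = 1 - ss_res / ss_tot
    adj = 1 - (1 - r2) * (n - 1) / (n - p - 1)
    return r2, adj

rng = np.random.default_rng(42)
n = 60
signal = rng.normal(size=(n, 1))
noise = rng.normal(size=(n, 8))             # eight irrelevant predictors
y = 2.0 * signal[:, 0] + rng.normal(size=n)

r2_small, adj_small = r2_scores(signal, y)
r2_big, adj_big = r2_scores(np.hstack([signal, noise]), y)
# Plain R^2 can only rise as columns are added; adjusted R^2
# penalizes the extra noise columns instead of rewarding them.
```

The apparent gain in plain R² from the noise columns is exactly the kind of misleading accuracy the bullet above describes.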

2. Demerits

Demerits of backward elimination are as follows:

  • In the backward elimination method, one cannot find out which predictor is responsible for pushing another predictor into insignificance and hence rejection. For instance, predictor X may have enough significance to reside in the model after predictor Y is added, yet lose that significance when another predictor Z enters the model. The backward elimination algorithm gives no evidence about such dependencies between two predictors, unlike the forward selection technique.
  • After a feature has been discarded from the model by backward elimination, it can never be selected again. In short, backward elimination does not have a flexible approach for both adding and removing features/predictors.
  • The norm for selecting the significance value (0.05) in the model is inflexible. Backward elimination has no flexible procedure to choose, let alone change, the significance threshold as required to fetch the best fit for a given dataset.

Conclusion

The backward elimination technique is used to ameliorate a model's performance and to optimize its complexity. It is widely used in multiple regression, where the model deals with an extensive dataset. It is an easy and simple approach compared with forward selection and cross-validation, in which a greater optimization overload is encountered. The backward elimination technique initiates elimination from the features with the highest significance value (p-value). Its basic objective is to make the model less complex and to forbid over-fitting.

Recommended Articles

This is a guide to Backward Elimination. Here we discuss how to apply backward elimination along with its merits and demerits. You may also look at the following articles to learn more:

  1. Hyperparameter Machine Learning
  2. Clustering in Machine Learning
  3. Java Virtual Machine
  4. Unsupervised Machine Learning
  5. Various Pattern Recognition Applications