Machine Learning is the scientific study of algorithms and statistical models that computer systems use to perform tasks without explicit instructions, relying instead on patterns learned from data.

Machine Learning ensemble methods use multiple learning algorithms to obtain better predictive performance than any of the constituent algorithms alone.

Boosting is a Machine Learning ensemble meta-algorithm. The goal of boosting is to combine a set of weak learners into a single strong learner.

A weak learner is defined to be a classifier that is only slightly correlated with the true classification and a strong learner is a classifier that is arbitrarily well-correlated with the true classification.
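
To see the difference concretely, here is a minimal sketch (assuming scikit-learn is installed; the dataset and parameters are illustrative choices, not from the article) comparing a single decision stump, a classic weak learner, against an AdaBoost ensemble of stumps:

```python
# A depth-1 decision tree ("stump") as a weak learner versus an AdaBoost
# ensemble of stumps behaving like a strong learner on the same data.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

stump = DecisionTreeClassifier(max_depth=1).fit(X_train, y_train)
ensemble = AdaBoostClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)

print("stump accuracy:   ", stump.score(X_test, y_test))     # typically well below the ensemble
print("ensemble accuracy:", ensemble.score(X_test, y_test))  # typically much closer to 1.0
```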

Related Course:
Deep Learning for Computer Vision with TensorFlow and Keras

AdaBoost is one of the most widely used boosting algorithms.

How does AdaBoost work?

AdaBoost, short for Adaptive Boosting, was formulated by computer scientists Yoav Freund and Robert Schapire, winners of the 2003 Gödel Prize for their work.

It combines weak classifiers to form a strong classifier. If a single algorithm classifies the objects poorly, we can combine multiple classifiers to improve the predictions and achieve better accuracy.

Explaining AdaBoost can go something like this (a from-scratch sketch follows the list):

  • Step 1. Select a learning algorithm.

  • Step 2. Apply it to your data. If the data has 100 samples, some of the predicted class labels will be wrong. Let’s say 90 are correct and 10 are wrong.

  • Step 3. Resample the instances in the training set so that your learning algorithm now has an error rate of 50%.

With the data above, you do this by copying the wrongly classified samples 9 times for the next iteration, so the data set becomes 90 correct and 90 incorrect instances.

  • Step 4. Feed the data from step 3 back into step 2. Then repeat this process, each time producing a new classifier.

  • Step 5. Stop when you have repeated this enough times or when you reach 100% accuracy. This leaves you with a set of different classifiers.

  • Step 6. To classify, vote across all of the classifiers you built in the previous steps.
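
Putting the steps together, here is the promised from-scratch sketch of that loop (illustrative code assuming NumPy and scikit-learn, not the algorithm's original implementation). Instead of physically copying the misclassified samples, it uses the mathematically equivalent trick of sample weights: after each round the weights are rescaled so that the misclassified samples carry 50% of the total weight, which is exactly the 90-correct/90-incorrect situation from step 3.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost_train(X, y, n_rounds=10):
    """Train AdaBoost on labels coded -1/+1; returns (classifiers, alphas)."""
    y = np.asarray(y)
    n = len(y)
    w = np.full(n, 1.0 / n)                   # start from uniform sample weights
    classifiers, alphas = [], []
    for _ in range(n_rounds):                 # steps 4-5: repeat up to n_rounds times
        stump = DecisionTreeClassifier(max_depth=1)   # steps 1-2: the weak learner
        stump.fit(X, y, sample_weight=w)
        pred = stump.predict(X)
        err = w[pred != y].sum()              # weighted error rate (w sums to 1)
        if err >= 0.5:                        # no better than chance: give up
            break
        err = max(err, 1e-10)                 # guard against log(0) on a perfect round
        alpha = 0.5 * np.log((1 - err) / err) # vote weight: lower error, bigger vote
        classifiers.append(stump)
        alphas.append(alpha)
        # Step 3: up-weight the mistakes, down-weight the correct samples and
        # renormalize; the mistakes now carry exactly half of the total weight.
        w *= np.exp(-alpha * y * pred)
        w /= w.sum()
    return classifiers, alphas

def adaboost_predict(X, classifiers, alphas):
    # Step 6: a weighted vote across all the weak learners built above.
    votes = sum(a * clf.predict(X) for a, clf in zip(alphas, classifiers))
    return np.sign(votes)
```

For labels coded 0/1, convert them first, e.g. with `np.where(y == 1, 1, -1)`.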

Understanding how a Machine Learning model makes its predictions is just as important as the accuracy of the model.

The final AdaBoost model is a weighted linear combination of the weak learners, H(x) = sign(α1·h1(x) + α2·h2(x) + … + αT·hT(x)), where each weight αt reflects how well weak learner ht performed. So if the model fails at some point, you can trace the blame back to individual weak learners.

Programmers use AdaBoost because it allows them to manually tune the model to improve performance.
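
As a sketch of what that manual tuning can look like in practice (scikit-learn assumed; the parameter grid here is just an example), the two knobs most often adjusted are the number of weak learners and the learning rate:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=1000, random_state=0)

# Try a few settings by hand and compare cross-validated accuracy.
for n_estimators in (50, 100, 200):
    for learning_rate in (0.5, 1.0):
        model = AdaBoostClassifier(n_estimators=n_estimators,
                                   learning_rate=learning_rate,
                                   random_state=0)
        score = cross_val_score(model, X, y, cv=5).mean()
        print(f"n_estimators={n_estimators}, learning_rate={learning_rate}: {score:.3f}")
```

More estimators and a lower learning rate generally trade training time for stability; there is no universally best setting, which is why this kind of manual search helps.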