 

Basic understanding of the Adaboost algorithm

I'm a machine learning newbie trying to understand how Adaboost works.

I've read many articles explaining how Adaboost uses a set of weak *classifiers* to create a strong classifier.

However, I seem to have a problem understanding the statement that "Adaboost creates a Strong Classifier".

When I looked at implementations of Adaboost, I realized that it doesn't "actually" create a strong classifier; instead, at TESTING time it figures out how to combine a set of weak classifiers to get more accurate results, so that collectively they act like a strong classifier.

So technically there is NO SINGLE STRONG CLASSIFIER created; rather, a set of weak classifiers collectively acts as a strong classifier.
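Here is a rough sketch of what I mean (the function name and the weights are my own, just for illustration): the weak classifiers' outputs are combined by a weighted vote, and the "strong classifier" is just the sign of that vote.

```python
# Toy sketch of AdaBoost's final decision rule: the sign of the
# alpha-weighted sum of weak-classifier votes (labels are +1/-1).
def strong_classify(x, weak_classifiers, alphas):
    score = sum(a * h(x) for h, a in zip(weak_classifiers, alphas))
    return 1 if score >= 0 else -1

# Hypothetical weak classifiers on a scalar input:
weak = [lambda x: 1 if x > 0 else -1,
        lambda x: 1 if x > 2 else -1,
        lambda x: 1 if x < 5 else -1]
alphas = [0.4, 0.6, 0.9]   # weights AdaBoost would have learned

print(strong_classify(3, weak, alphas))   # weighted vote -> 1
```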

Please correct me if I'm wrong. It would be nice if someone could throw in some comments regarding this.

asked Apr 07 '12 by garak


People also ask

What is meant by AdaBoost?

AdaBoost, short for Adaptive Boosting, is a statistical classification meta-algorithm formulated by Yoav Freund and Robert Schapire in 1995, who won the 2003 Gödel Prize for their work. It can be used in conjunction with many other types of learning algorithms to improve performance.

What is AdaBoost learning?

AdaBoost is an ensemble learning method (also known as “meta-learning”) which was initially created to increase the efficiency of binary classifiers. AdaBoost uses an iterative approach to learn from the mistakes of weak classifiers, and turn them into strong ones.

What is AdaBoost classification?

An AdaBoost [1] classifier is a meta-estimator that begins by fitting a classifier on the original dataset and then fits additional copies of the classifier on the same dataset but where the weights of incorrectly classified instances are adjusted such that subsequent classifiers focus more on difficult cases.

What are the advantages of the AdaBoost algorithm?

Coming to the advantages, Adaboost is less prone to overfitting because the input parameters are not jointly optimized, and it can improve the accuracy of weak classifiers. Nowadays, Adaboost is used for tasks such as text and image classification, not only binary classification problems.




1 Answer

A classifier is a black box that receives an input (a feature vector) and returns an output (a label). So to call something a classifier, you only care about what it does, not how it does it. AdaBoost's classifier can be seen as such a black box, so it is indeed a single classifier, even if it internally uses several weak classifiers to produce its output.
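To illustrate the black-box point, here is a hypothetical wrapper (the class name and weights are invented for illustration): from the caller's perspective it is one classifier with a single `predict` method, even though a weighted vote over weak learners happens inside.

```python
# Hypothetical illustration: an ensemble behind a single-classifier interface.
class BoostedClassifier:
    def __init__(self, weak_classifiers, alphas):
        self._weak = weak_classifiers    # internals the caller never sees
        self._alphas = alphas

    def predict(self, x):
        # One input in, one label out -- the weighted vote over the
        # weak classifiers is an implementation detail of the black box.
        score = sum(a * h(x) for h, a in zip(self._weak, self._alphas))
        return 1 if score >= 0 else -1

clf = BoostedClassifier(
    [lambda x: 1 if x > 0 else -1, lambda x: 1 if x > 2 else -1],
    [0.7, 0.3],
)
print(clf.predict(1))   # -> 1
```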

answered Sep 28 '22 by Diego