Characteristics of Boosting in Machine Learning
Boosting in Machine Learning:
'Boosting' refers to a family of algorithms that convert weak learners into strong learners. It is an iterative technique that adjusts the weight of each observation based on the most recent classification: if an observation was classified incorrectly, its weight is increased, and if it was classified correctly, its weight is decreased.
Boosting is a supervised machine learning strategy that combines the predictions of multiple weak models into a powerful ensemble model. In general, boosting reduces bias error and builds strong predictive models; however, boosted models may sometimes overfit the training data.
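The reweighting idea described above can be illustrated with a minimal sketch of an AdaBoost-style training loop. It assumes binary labels encoded as -1/+1 and uses depth-1 decision trees (stumps) from scikit-learn as the weak learners; the function names `boost` and `predict` are illustrative only.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier  # depth-1 trees act as weak learners


def boost(X, y, n_rounds=10):
    """Sketch of boosting by reweighting; y is assumed to contain -1/+1 labels."""
    n = len(y)
    weights = np.full(n, 1.0 / n)              # start with uniform observation weights
    learners, alphas = [], []

    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=weights)  # weak learner focuses on heavy points
        pred = stump.predict(X)

        err = np.clip(np.sum(weights[pred != y]), 1e-10, 1 - 1e-10)  # weighted error
        alpha = 0.5 * np.log((1 - err) / err)    # vote weight of this weak learner

        # Increase weights of misclassified observations, decrease the rest
        weights *= np.exp(-alpha * y * pred)
        weights /= weights.sum()

        learners.append(stump)
        alphas.append(alpha)

    return learners, alphas


def predict(learners, alphas, X):
    # Final prediction is a weighted vote of all weak learners
    votes = sum(a * clf.predict(X) for a, clf in zip(alphas, learners))
    return np.sign(votes)
```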
Characteristics of Boosting:
1. The final prediction is made by a weighted vote of the weak learners.
2. The algorithm proceeds iteratively; new models are influenced by the performance of previous ones.
3. New models become experts for instances classified incorrectly by earlier models.
4. It can be used without explicit weights by re-sampling the training data with probabilities determined by the weights.
5. It works well if classifiers are not too complex.
6. It also works well with simple (weak) learners.
7. AdaBoost is a popular boosting algorithm; a short usage sketch is shown after this list.
8. LogitBoost is another boosting algorithm; it fits an additive logistic regression model and can handle multi-class problems.
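As a short usage sketch of the AdaBoost algorithm mentioned above, the following example uses scikit-learn's AdaBoostClassifier on a synthetic dataset; the dataset and parameter values are illustrative only.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split

# Illustrative synthetic binary classification data
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# n_estimators is the number of boosting rounds (weak learners)
clf = AdaBoostClassifier(n_estimators=50, random_state=0)
clf.fit(X_train, y_train)
print("Test accuracy:", clf.score(X_test, y_test))
```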