Difference between Bagging, Boosting, and Stacking


Bagging, also known as Bootstrap Aggregation, is an ensemble learning technique that combines bootstrapping and aggregation: multiple models are trained on bootstrap samples of the training data, and their predictions are aggregated. This stabilizes the model and improves its prediction performance, chiefly by reducing variance.
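As a minimal sketch of bagging, assuming scikit-learn is available: `BaggingClassifier` draws bootstrap samples, fits one decision tree per sample (its default base learner), and aggregates the trees' predictions by voting. The dataset here is synthetic and only for illustration.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import train_test_split

# Synthetic binary-classification data (illustrative only).
X, y = make_classification(n_samples=500, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Bagging: 50 trees, each fit on a bootstrap sample (sampling with
# replacement); predictions are combined by majority vote.
bag = BaggingClassifier(n_estimators=50, bootstrap=True, random_state=0)
bag.fit(X_train, y_train)
print(bag.score(X_test, y_test))
```

Because each tree sees a different resampling of the data, the averaged ensemble is less sensitive to any single noisy sample than one tree fit on the full set.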


Boosting is an ensemble learning technique that combines the predictions of weak learners sequentially. Each new model is trained on a reweighted training set, where the weights are assigned based on the errors of the previous models in the sequence, so later models focus on the samples earlier models got wrong.
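A minimal boosting sketch, again assuming scikit-learn: `AdaBoostClassifier` fits a sequence of shallow trees, upweighting the samples each previous tree misclassified, and combines them by a weighted majority vote.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split

# Synthetic binary-classification data (illustrative only).
X, y = make_classification(n_samples=500, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# AdaBoost: each successive weak learner (a decision stump by default)
# is trained with higher weights on previously misclassified samples.
ada = AdaBoostClassifier(n_estimators=100, random_state=0)
ada.fit(X_train, y_train)
print(ada.score(X_test, y_test))
```

Unlike bagging, the learners here cannot be trained in parallel: each one depends on the sample weights produced by the one before it.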


In stacking, the predictions of the base models are fed as input to a meta-model, whose job is to combine those predictions into a final prediction. It is also known as Stacked Generalization.
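A minimal stacking sketch, assuming scikit-learn: `StackingClassifier` fits the base models, generates their predictions via cross-validation, and trains a logistic-regression meta-model on those predictions. The choice of base learners here is arbitrary and only for illustration.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

# Synthetic binary-classification data (illustrative only).
X, y = make_classification(n_samples=500, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Two heterogeneous base models; their cross-validated predictions
# become the input features of the logistic-regression meta-model.
stack = StackingClassifier(
    estimators=[
        ("tree", DecisionTreeClassifier(random_state=0)),
        ("knn", KNeighborsClassifier()),
    ],
    final_estimator=LogisticRegression(),
    cv=5,
)
stack.fit(X_train, y_train)
print(stack.score(X_test, y_test))
```

Using cross-validated (out-of-fold) predictions to train the meta-model prevents it from simply memorizing base models that overfit the training set.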

Bagging vs Boosting vs Stacking:

|                                        | Bagging           | Boosting                                      | Stacking            |
|----------------------------------------|-------------------|-----------------------------------------------|---------------------|
| Partitioning of the data into subsets  | Random            | Misclassified samples given higher preference | Various             |
| Goal to achieve                        | Minimize variance | Increase predictive force                     | Both                |
| Methods where it is used               | Random subspace   | Gradient descent                              | Blending            |
| Function to combine single models      | Weighted average  | Weighted majority vote                        | Logistic regression |