Difference between Bagging, Boosting, and Stacking
Bagging:
Bagging, also known as Bootstrap Aggregation, is an ensemble learning technique that combines bootstrapping (sampling the training data with replacement) and aggregation (averaging or voting over the resulting models) to reduce variance and improve the prediction performance of a machine learning model.
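
Below is a minimal bagging sketch using scikit-learn's `BaggingClassifier`; the synthetic dataset and hyperparameters are illustrative assumptions, not part of the original text.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Illustrative synthetic dataset.
X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Each of the 50 trees is fit on a bootstrap sample of the training set;
# the final prediction aggregates the trees by majority vote.
# (In scikit-learn < 1.2 the `estimator` parameter is `base_estimator`.)
bagging = BaggingClassifier(
    estimator=DecisionTreeClassifier(),
    n_estimators=50,
    bootstrap=True,
    random_state=42,
)
bagging.fit(X_train, y_train)
print("Bagging accuracy:", bagging.score(X_test, y_test))
```
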
Boosting:
Boosting is an ensemble learning technique that combines the predictions of weak learners sequentially. Each new model is trained on a weighted version of the training set, with weights assigned based on the errors of the previous models so that later learners focus on the samples earlier ones misclassified.
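
A minimal sketch of this reweighting scheme, using scikit-learn's `AdaBoostClassifier` with decision stumps as the weak learners; the dataset and hyperparameters are again illustrative assumptions.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Illustrative synthetic dataset.
X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Weak learners (depth-1 "stumps") are added one at a time; after each
# round, misclassified samples receive higher weight, so the next stump
# concentrates on them. The ensemble combines stumps by weighted vote.
# (In scikit-learn < 1.2 the `estimator` parameter is `base_estimator`.)
boosting = AdaBoostClassifier(
    estimator=DecisionTreeClassifier(max_depth=1),
    n_estimators=100,
    learning_rate=0.5,
    random_state=42,
)
boosting.fit(X_train, y_train)
print("Boosting accuracy:", boosting.score(X_test, y_test))
```
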
Stacking:
In stacking, the predictions of base models are fed as input to a meta-model. The job of the meta-model is to take the predictions of the base models and make a final prediction. It is also known as Stacked Generalization.
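
A minimal stacking sketch using scikit-learn's `StackingClassifier`; the choice of base models, the logistic-regression meta-model, and the dataset are illustrative assumptions.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

# Illustrative synthetic dataset.
X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Base models produce out-of-fold predictions via 5-fold cross-validation;
# the logistic-regression meta-model then learns how to combine those
# predictions into the final output.
stacking = StackingClassifier(
    estimators=[
        ("rf", RandomForestClassifier(n_estimators=50, random_state=42)),
        ("knn", KNeighborsClassifier()),
        ("svm", SVC(probability=True, random_state=42)),
    ],
    final_estimator=LogisticRegression(),
    cv=5,
)
stacking.fit(X_train, y_train)
print("Stacking accuracy:", stacking.score(X_test, y_test))
```

Training the meta-model on cross-validated (out-of-fold) predictions, rather than predictions on the same data the base models were fit on, helps keep the meta-model from simply learning the base models' overfitting.
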
Bagging vs Boosting vs Stacking:
| Parameters | Bagging | Boosting | Stacking |
|---|---|---|---|
| Partitioning of the data | Random bootstrap samples (with replacement) | Misclassified samples are given higher weight | Varies (base models are often trained on the full data via cross-validation) |
| Goal to achieve | Reduce variance | Reduce bias | Improve predictive accuracy (both) |
| Example methods | Random subspace, random forest | AdaBoost, gradient boosting | Blending |
| Function to combine single models | Simple average or majority vote | Weighted majority vote | A meta-model (e.g., logistic regression) |