Ensemble methods combine multiple machine learning models to achieve better predictive performance than any single model alone. By aggregating diverse predictions, ensembles reduce errors and improve generalization. This section covers major ensemble strategies, including bagging, boosting, and stacking, along with popular algorithms such as Random Forest and XGBoost.
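To make the aggregation idea concrete, here is a minimal from-scratch sketch of bagging (bootstrap aggregation) using the standard library only. The `DecisionStump` and `bagging_predict` names are illustrative, not from any library: each base learner is a one-feature threshold classifier trained on a bootstrap resample, and the ensemble prediction is a majority vote.

```python
import random
from collections import Counter

class DecisionStump:
    """One-feature threshold rule: predict 1 when x[feature] >= threshold (or the inverse)."""
    def fit(self, X, y):
        best = None
        for f in range(len(X[0])):
            for t in sorted({row[f] for row in X}):
                preds = [1 if row[f] >= t else 0 for row in X]
                acc = sum(p == label for p, label in zip(preds, y)) / len(y)
                # Also consider the inverted rule (polarity -1) in case 1 - acc is better.
                for polarity, a in ((1, acc), (-1, 1 - acc)):
                    if best is None or a > best[0]:
                        best = (a, f, t, polarity)
        self.accuracy, self.feature, self.threshold, self.polarity = best
        return self

    def predict(self, X):
        out = []
        for row in X:
            p = 1 if row[self.feature] >= self.threshold else 0
            out.append(p if self.polarity == 1 else 1 - p)
        return out

def bagging_predict(X_train, y_train, X_test, n_estimators=25, seed=0):
    """Bagging: train each stump on a bootstrap resample, then majority-vote."""
    rng = random.Random(seed)
    n = len(X_train)
    all_preds = []
    for _ in range(n_estimators):
        idx = [rng.randrange(n) for _ in range(n)]  # sample with replacement
        Xb = [X_train[i] for i in idx]
        yb = [y_train[i] for i in idx]
        stump = DecisionStump().fit(Xb, yb)
        all_preds.append(stump.predict(X_test))
    # Majority vote across estimators for each test point.
    return [Counter(votes).most_common(1)[0][0] for votes in zip(*all_preds)]
```

Because each stump sees a different resample, individual stumps disagree, but the vote smooths out their errors; Random Forest applies the same principle with full decision trees plus random feature subsets.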