Ensemble Learning - Machine Learning
Understanding Ensemble Learning
- The key idea behind Ensemble Learning is combining the results of multiple models to get better predictions, much like seeking the opinion of a panel of experts.
- It has been used to win several Machine Learning competitions, e.g. the Netflix Prize and the KDD Cup competition for Data Mining.
- We can build an ensemble by using the same model several times with different parameter settings (Homogeneous Ensembles) or by combining different types of models (Heterogeneous Ensembles); see the first sketch after this list.
- This approach is especially useful when we have limited data to work with.
- It not only improves performance but also reduces overfitting, because each model is trained on a different training dataset created by randomly drawing samples.
- An ensemble typically performs better than its individual learners, i.e. the error rate of the ensemble is lower than the average error rate of the individual learners.
- We can combine 100 or even 1000 models, but it then becomes hard to explain which parts of the ensemble contribute most to the improved performance.
- There are 3 main ways to build an ensemble (see the second sketch after this list):
- Parallel Learning (Bagging)
- Sequential Learning (Boosting)
- Stacking (Meta Learning)
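
Below is a minimal sketch of the two ensemble styles (homogeneous vs heterogeneous) using scikit-learn. The dataset, base models, and hyperparameters are illustrative assumptions, not part of the original notes.

```python
# Sketch: homogeneous vs heterogeneous ensembles with scikit-learn.
# Dataset and hyperparameters below are illustrative assumptions.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.ensemble import BaggingClassifier, VotingClassifier

X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Homogeneous ensemble: the same base model repeated many times, each copy
# trained on a different bootstrap sample of the training data (this random
# resampling is what helps reduce overfitting).
homogeneous = BaggingClassifier(
    DecisionTreeClassifier(),
    n_estimators=50,
    bootstrap=True,
    random_state=42,
)

# Heterogeneous ensemble: different model types combined by majority vote.
heterogeneous = VotingClassifier(
    estimators=[
        ("tree", DecisionTreeClassifier()),
        ("logreg", LogisticRegression(max_iter=1000)),
        ("knn", KNeighborsClassifier()),
    ],
    voting="hard",
)

for name, model in [("homogeneous", homogeneous), ("heterogeneous", heterogeneous)]:
    model.fit(X_train, y_train)
    print(name, "test accuracy:", model.score(X_test, y_test))

# For comparison: a single decision tree trained on the same split.
single = DecisionTreeClassifier(random_state=42).fit(X_train, y_train)
print("single tree test accuracy:", single.score(X_test, y_test))
```

On held-out data, a single base learner will usually show a higher error rate than either ensemble, which is the comparison described in the notes above.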

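The three strategies listed above map onto standard scikit-learn estimators; the base learners and meta-learner chosen here are again illustrative assumptions, not a prescribed setup.

```python
# Sketch: the three ways of building an ensemble.
from sklearn.ensemble import BaggingClassifier, AdaBoostClassifier, StackingClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.svm import SVC
from sklearn.linear_model import LogisticRegression

# Parallel learning (Bagging): independent models, each trained on a
# different bootstrap sample; their predictions are averaged or voted.
bagging = BaggingClassifier(DecisionTreeClassifier(), n_estimators=50)

# Sequential learning (Boosting): models are trained one after another,
# each focusing on the examples the previous models got wrong.
boosting = AdaBoostClassifier(n_estimators=50)

# Stacking (Meta Learning): a final "meta" model learns how to combine
# the predictions of the base models.
stacking = StackingClassifier(
    estimators=[
        ("tree", DecisionTreeClassifier()),
        ("svm", SVC(probability=True)),
    ],
    final_estimator=LogisticRegression(),
)
```

All three are used through the usual fit/predict interface; the difference lies in how the base models are trained and how their predictions are combined.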