Introduction by Zhiming:
For next week’s reading group, I propose the topic of Ensemble Learning Methods; there will be two slide decks for this session. I hope that through this reading group everyone gets a basic idea of ensemble learning and learns about a newer method, Gradient Boosted Trees.
- Ensembles of Learners: Slides. Videos.
- Introduction to Boosted Trees
- Gradient Boosted Regression Trees in scikit-learn (optional)
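As a small taste of the scikit-learn material listed above, here is a minimal sketch of fitting a `GradientBoostingRegressor`. The dataset (`make_friedman1`) and the hyperparameter values are illustrative choices, not from the readings themselves.

```python
# Minimal sketch: gradient boosted regression trees in scikit-learn.
from sklearn.datasets import make_friedman1
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

# Synthetic regression benchmark (illustrative choice).
X, y = make_friedman1(n_samples=500, noise=1.0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Each new shallow tree fits the residual errors of the ensemble so far.
gbrt = GradientBoostingRegressor(n_estimators=200, learning_rate=0.1,
                                 max_depth=3, random_state=0)
gbrt.fit(X_train, y_train)
print("test R^2:", round(gbrt.score(X_test, y_test), 3))
```

Lowering `learning_rate` while raising `n_estimators` is the usual knob for trading training time against generalization.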
Bagging combines its learners with an unweighted average (or majority vote). AdaBoost instead weights each learner by its accuracy, and works well when you have many features and want to use decision stumps as weak learners. Random forests reduce variance by averaging many de-correlated trees, at the cost of a small increase in bias.
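The contrast above can be sketched with scikit-learn: bagging gives every tree an equal vote, while AdaBoost builds depth-1 trees (stumps) and weights each one by how accurate it is. The dataset and estimator counts are illustrative assumptions.

```python
# Sketch contrasting bagging (unweighted vote) with AdaBoost (weighted stumps).
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

# Synthetic classification task (illustrative choice).
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

# Bagging: full-depth trees on bootstrap samples; every tree's vote counts equally.
bagging = BaggingClassifier(DecisionTreeClassifier(),
                            n_estimators=100, random_state=0)

# AdaBoost: decision stumps (max_depth=1), each weighted by its accuracy,
# with misclassified points re-weighted for the next stump.
ada = AdaBoostClassifier(DecisionTreeClassifier(max_depth=1),
                         n_estimators=100, random_state=0)

scores = {}
for name, clf in [("bagging", bagging), ("adaboost", ada)]:
    scores[name] = cross_val_score(clf, X, y, cv=5).mean()
    print(name, round(scores[name], 3))
```

Both ensembles should do well here; the interesting differences show up on data where weak learners alone underfit badly.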