Ensemble Learning Methods

Introduction by Zhiming:

For next week’s reading group, I propose the topic of Ensemble Learning Methods; there will be two sets of slides for this reading group. I hope that through this reading group, everyone gains a basic idea of ensemble learning and learns a new method called Gradient Boosted Trees.

Required Reading:

  1. Ensembles of Learners: Slides. Videos.
  2. Introduction to Boosted Trees
  3. Gradient Boosted Regression Trees in scikit-learn (optional)

Take-home message:

Bagging does not weight the results of its individual learners; AdaBoost does, and works well when you have many features and want to use decision stumps as weak learners. Random forests involve a bias/variance tradeoff: averaging many randomized trees reduces variance at the cost of some bias.
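To make the comparison concrete, here is a minimal sketch that fits the three ensemble methods above with scikit-learn on a synthetic dataset. The hyperparameters (50 estimators, default stumps for AdaBoost) are illustrative choices, not recommendations from the readings.

```python
# Compare bagging, AdaBoost, and gradient boosted trees on synthetic data.
from sklearn.datasets import make_classification
from sklearn.ensemble import (AdaBoostClassifier, BaggingClassifier,
                              GradientBoostingClassifier)
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

models = {
    # Bagging: unweighted average of trees fit on bootstrap samples.
    "bagging": BaggingClassifier(DecisionTreeClassifier(),
                                 n_estimators=50, random_state=0),
    # AdaBoost: weighted vote; the default base learner is a depth-1 stump.
    "adaboost": AdaBoostClassifier(n_estimators=50, random_state=0),
    # Gradient boosting: each tree fits the gradient of the loss
    # with respect to the current ensemble's predictions.
    "gradient_boosting": GradientBoostingClassifier(n_estimators=50,
                                                    random_state=0),
}

scores = {name: model.fit(X_train, y_train).score(X_test, y_test)
          for name, model in models.items()}
for name, score in scores.items():
    print(f"{name}: {score:.3f}")
```

All three typically score well on this easy synthetic task; the interesting differences (weighting, sequential fitting) are in how they combine the trees, not the headline accuracy.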
