December 25, 2016

[Skip to the bottom if you just want the coupon]

This course is all about **ensemble methods**.

We’ve already learned some classic machine learning models like **k-nearest neighbors** and **decision trees**, and we’ve studied their limitations and drawbacks.

But what if we could combine these models to overcome those limitations and produce a much more powerful classifier or regressor?

In this course you’ll study ways to combine models like decision trees and logistic regression to build models that can reach much higher accuracies than the base models they are made of.

In particular, we will study the **Random Forest** and **AdaBoost** algorithms in detail.
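To get a first taste of what these two algorithms can do, here is a minimal sketch using scikit-learn's off-the-shelf implementations on a synthetic dataset (the dataset and all parameter choices here are illustrative, not taken from the course, which builds the algorithms up from first principles):

```python
# Illustrative sketch: Random Forest and AdaBoost via scikit-learn
# on a synthetic classification problem.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, RandomForestClassifier
from sklearn.model_selection import train_test_split

# A toy dataset, just so the example is self-contained
X, y = make_classification(n_samples=500, n_features=20, random_state=42)
Xtrain, Xtest, ytrain, ytest = train_test_split(X, y, random_state=42)

scores = {}
for Model in (RandomForestClassifier, AdaBoostClassifier):
    clf = Model(n_estimators=100, random_state=42)  # 100 base learners each
    clf.fit(Xtrain, ytrain)
    scores[Model.__name__] = clf.score(Xtest, ytest)
    print(Model.__name__, scores[Model.__name__])
```

Both ensembles are built from decision trees under the hood, yet they combine them in very different ways, which is exactly what the course explores.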

To motivate our discussion, we will learn about an important topic in statistical learning, the **bias-variance trade-off**. We will then study the **bootstrap** technique and **bagging** as methods for reducing variance without substantially increasing bias.
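The core idea of bagging fits in a few lines: train each base model on a bootstrap sample (drawn with replacement from the training set) and combine their predictions by majority vote. The following from-scratch sketch, on an assumed synthetic dataset with illustrative parameters, shows the mechanics:

```python
# Minimal bagging sketch: bootstrap samples + majority vote over trees.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, n_features=20, random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

rng = np.random.default_rng(0)
trees = []
for _ in range(25):
    # Bootstrap sample: draw N indices with replacement
    idx = rng.integers(0, len(Xtr), size=len(Xtr))
    trees.append(DecisionTreeClassifier().fit(Xtr[idx], ytr[idx]))

# Majority vote: average the 0/1 predictions and threshold at 0.5
votes = np.mean([t.predict(Xte) for t in trees], axis=0)
yhat = (votes >= 0.5).astype(int)
bagged_acc = float(np.mean(yhat == yte))

single_acc = DecisionTreeClassifier(random_state=0).fit(Xtr, ytr).score(Xte, yte)
print(f"single tree: {single_acc:.3f}, bagged trees: {bagged_acc:.3f}")
```

Each individual tree overfits its own bootstrap sample, but because the trees make different mistakes, averaging them smooths out much of that variance.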

We’ll do plenty of **experiments** and use these algorithms on **real datasets** so you can see first-hand how powerful they are.

Since **deep learning** is so popular these days, we will study some interesting commonalities between random forests, AdaBoost, and deep learning neural networks.

https://www.udemy.com/machine-learning-in-python-random-forest-adaboost/?couponCode=EARLYBIRDSITE2