[New Release] Machine Learning and AI: Support Vector Machines in Python

Support Vector Machines in Python

 

Wow, I didn’t think I’d be coming out with another course so soon – but here it is!

[if you don’t want to read my little spiel just click here to get your VIP coupon: https://deeplearningcourses.com/c/support-vector-machines-in-python]

[By the way, I went all-out this time in the VIP version – you’ll want to check it out below – comes with 4 all-new models (both theory+code provided of course)]

The SVM is one of the most robust and powerful machine learning models. It can be a very useful “plug-and-play” solution: just throw your data into the model and wait for the magic to happen.
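To make that concrete, here is a minimal “plug-and-play” sketch using scikit-learn’s SVC on a built-in toy dataset (my own illustration, not code taken from the course):

    # Minimal "plug-and-play" SVM: fit, score, done.
    from sklearn.datasets import load_breast_cancer
    from sklearn.model_selection import train_test_split
    from sklearn.svm import SVC

    X, y = load_breast_cancer(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    model = SVC()                       # default RBF kernel
    model.fit(X_train, y_train)         # "throw your data into the model"
    print(model.score(X_test, y_test))  # test accuracy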

Unlike deep learning, where you can spend days or weeks tuning hyperparameters, an SVM has only two hyperparameters, which are generally easy to understand and reason about.
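For the common RBF-kernel setup, those two hyperparameters are the regularization strength C and the kernel width gamma. A quick tuning sketch (again scikit-learn, my own illustration) could look like this:

    # Tuning the two RBF-SVM hyperparameters, C and gamma, with cross-validation.
    from sklearn.datasets import load_breast_cancer
    from sklearn.model_selection import GridSearchCV, train_test_split
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    X, y = load_breast_cancer(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    pipe = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
    param_grid = {
        "svc__C": [0.1, 1, 10, 100],          # how much slack (misclassification) to allow
        "svc__gamma": [0.001, 0.01, 0.1, 1],  # width of the RBF kernel
    }
    search = GridSearchCV(pipe, param_grid, cv=5)
    search.fit(X_train, y_train)
    print(search.best_params_, search.score(X_test, y_test))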

One of the things you’ll learn in this course is that a support vector machine is actually a neural network: if you drew each one as a diagram, the two would look essentially identical.
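As a preview of that idea (my own summary, not a quote from the course): the kernel SVM’s decision function already has the shape of a one-hidden-layer network, where each support vector acts as a hidden unit and the output weights are the learned coefficients:

    % Kernel SVM decision function: hidden "units" K(x_i, x), output weights alpha_i * y_i
    f(x) = \operatorname{sign}\left( \sum_{i=1}^{N} \alpha_i \, y_i \, K(x_i, x) + b \right)

With a Gaussian kernel, each support vector is literally a radial-basis unit centered at x_i, which is exactly the structure of the RBF networks covered later in the course.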


The toughest obstacle to overcome when you’re learning about support vector machines is that they are very theoretical. That theory easily scares a lot of people away, and it might feel like understanding support vector machines is beyond your ability. Not so!

In this course, we take a very methodical, step-by-step approach to build up all the theory you need to understand how the SVM really works. Our starting point is Logistic Regression, one of the very first things you learn as a student of machine learning. So to follow this course, all you need is a good intuition about Logistic Regression and, by extension, a good understanding of the geometry of lines, planes, and hyperplanes.
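As a one-line reminder of that prerequisite (my summary): Logistic Regression predicts with the sigmoid of a linear score, so its decision boundary is a hyperplane, which is exactly the geometric object the SVM works with:

    % Logistic Regression: sigmoid of a linear score; the decision boundary is a hyperplane
    p(y = 1 \mid x) = \sigma(w^{\top} x + b), \qquad \text{decision boundary: } w^{\top} x + b = 0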

This course will cover the critical theory behind SVMs:

  • Linear SVM derivation
  • Hinge loss (and its relation to the Cross-Entropy loss)
  • Quadratic programming (and Linear programming review)
  • Slack variables
  • Lagrangian Duality
  • Kernel SVM (nonlinear SVM)
  • Polynomial Kernels, Gaussian Kernels, Sigmoid Kernels, and String Kernels (the first three are sketched in code after this list)
  • Learn how to achieve an infinite-dimensional feature expansion
  • Projected Gradient Descent
  • SMO (Sequential Minimal Optimization)
  • RBF Networks (Radial Basis Function Neural Networks)
  • Support Vector Regression (SVR)
  • Multiclass Classification
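To make the kernel-related bullets concrete, here is a small sketch of the standard kernel functions mentioned above (my own illustration; the course derives them properly and shows where they come from):

    import numpy as np

    def linear_kernel(x, z):
        # Plain dot product: the "no kernel trick" baseline
        return x @ z

    def polynomial_kernel(x, z, c=1.0, degree=3):
        # K(x, z) = (x.z + c)^degree
        return (x @ z + c) ** degree

    def gaussian_kernel(x, z, gamma=1.0):
        # RBF kernel: K(x, z) = exp(-gamma * ||x - z||^2), an infinite-dimensional feature expansion
        return np.exp(-gamma * np.sum((x - z) ** 2))

    def sigmoid_kernel(x, z, alpha=0.05, c=1.0):
        # K(x, z) = tanh(alpha * x.z + c)
        return np.tanh(alpha * (x @ z) + c)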

As a VIP bonus, you will also get material on how to apply the “Kernel Trick” to other machine learning models. This is how you can take a model that is normally “weak” (such as linear regression) and make it “strong”. I’ve chosen models from several different areas of machine learning.

  • Kernel Linear regression (for regression; a minimal sketch follows this list)
  • Kernel Logistic regression (for classification)
  • Kernel K-means clustering (for clustering)
  • Kernel Principal components analysis (PCA) (for dimensionality reduction)
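As an illustration of the first bullet, here is a minimal numpy sketch of kernel (ridge) regression: it is still “just linear regression”, but done in the kernel-induced feature space. This is my own interpretation of the bullet; the VIP material may set it up differently:

    import numpy as np

    def rbf_kernel_matrix(A, B, gamma=1.0):
        # Pairwise RBF kernel: K[i, j] = exp(-gamma * ||A[i] - B[j]||^2)
        sq_dists = np.sum(A**2, axis=1)[:, None] + np.sum(B**2, axis=1)[None, :] - 2 * A @ B.T
        return np.exp(-gamma * sq_dists)

    def fit_kernel_ridge(X, y, gamma=1.0, lam=1e-3):
        # Solve (K + lam*I) alpha = y; alpha are the dual weights
        K = rbf_kernel_matrix(X, X, gamma)
        return np.linalg.solve(K + lam * np.eye(len(X)), y)

    def predict_kernel_ridge(X_train, alpha, X_new, gamma=1.0):
        # Prediction is a weighted sum of kernel similarities to the training points
        return rbf_kernel_matrix(X_new, X_train, gamma) @ alpha

    # Toy usage: a "weak" linear model fitting a nonlinear function via the kernel trick
    X = np.linspace(-3, 3, 50).reshape(-1, 1)
    y = np.sin(2 * X).ravel() + 0.1 * np.random.randn(50)
    alpha = fit_kernel_ridge(X, y)
    y_hat = predict_kernel_ridge(X, alpha, X)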

Remember – the VIP bonus is only available at https://deeplearningcourses.com/c/support-vector-machines-in-python.

See what linear regression and logistic regression are capable of when the kernel trick is applied:

[images: kernel linear regression and kernel logistic regression demos]

For those of you who are thinking, “theory is not for me”, there’s lots of material in this course for you too!

In this course, there are not one but two full sections devoted purely to the practical aspects of how to make effective use of the SVM.

We’ll do end-to-end examples of real, practical machine learning applications, such as:

  • Image recognition
  • Spam detection (a minimal sketch follows this list)
  • Medical diagnosis
  • Regression analysis
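For a taste of what these look like, here is a minimal spam-detection sketch as a text-features-plus-SVM pipeline (the toy messages are placeholders I made up; the course works with real datasets):

    # Spam detection as an SVM pipeline: TF-IDF features into a linear SVM.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.pipeline import make_pipeline
    from sklearn.svm import LinearSVC

    messages = [
        "win a free prize now",             # spam
        "lowest prices, click here",        # spam
        "meeting moved to 3pm tomorrow",    # ham
        "can you review my pull request?",  # ham
    ]
    labels = [1, 1, 0, 0]

    clf = make_pipeline(TfidfVectorizer(), LinearSVC())
    clf.fit(messages, labels)
    print(clf.predict(["claim your free prize today"]))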

For more advanced students, there are also plenty of coding exercises where you will get to try different approaches to implementing SVMs.

These are implementations you won’t find in any other course.

I’ll see you in class!

P.S. As usual, if you primarily use another site (e.g. Udemy) you will automatically get free access (upon request) if you’ve already purchased the VIP version of the course from deeplearningcourses.com.

Get the course now