
[NEW COURSE] Math 0-1: Calculus for Data Science & Machine Learning

Hello friends!

I am back with a new course. Fun fact: people have been requesting this course for… about 7 years now (since I first started teaching machine learning courses online).

Click here if you don’t want to read my speech: https://deeplearningcourses.com/c/calculus-data-science

I’ve always said that in order to succeed, you must complete the prerequisites. While that advice still holds, I’ve generally found that students don’t actually follow it. Furthermore, some students would claim to have a PhD (as an excuse for not following instructions), yet they still sucked at math!

As they say, if you want something done properly, do it yourself. And that is what I am doing today.

This course will cover Calculus 1, 2, AND 3, but will focus on the parts most relevant to machine learning and data science.

Calculus 1: limits, derivatives, derivative rules, optimization, l’Hopital’s rule, Newton’s method

Calculus 2: integration

Calculus 3: calculus in multiple dimensions, chain rule, gradients, Jacobian, Hessian, optimization, Lagrange multipliers

The VIP version will have a special section on the Taylor expansion, which is necessary to understand the chain rule in multiple dimensions. It also serves as motivation for polynomial regression. The VIP version will also contain PDF (LaTeX) notes. And most importantly, the VIP version will contain many more exercises to practice what you’ve learned. It will amount to many more hours of content.
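If you haven’t seen the Taylor expansion before, here’s a rough sketch of the idea (my own notation here, not necessarily how the course presents it): near a point a, a smooth function is approximated by a polynomial built from its derivatives at a. In one variable and in multiple variables (where the gradient and Hessian play the roles of the first and second derivatives):

f(x) \approx f(a) + f'(a)(x - a) + \tfrac{1}{2} f''(a)(x - a)^2

f(\mathbf{x}) \approx f(\mathbf{a}) + \nabla f(\mathbf{a})^{\top}(\mathbf{x} - \mathbf{a}) + \tfrac{1}{2}(\mathbf{x} - \mathbf{a})^{\top} H_f(\mathbf{a})(\mathbf{x} - \mathbf{a})

The first-order (linear) term is the linearization behind the multivariable chain rule, and truncating the series at a chosen degree gives exactly the kind of polynomial model that polynomial regression fits to data.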

As always, this course became WAY more detailed than I originally planned. I initially aimed for 3 hours, and now it’s over 12 hours. There is still a bit of content to go, but all the essentials are already there.

Content to be completed in the coming days (update: all planned content is complete):

  • PDF notes
  • 3 more lectures in the vector calculus section (chain rule, steepest ascent, optimization & Lagrange multipliers)
  • Summaries for each section
  • Full intro + outline
  • How to succeed (math version)

Anyway, what are you waiting for?