Hello friends!
Click here if you don’t want to read my little spiel: https://
Alternate Udemy discount (expires Oct 4, 2023): https://www.udemy.com/course/linear-algebra-data-science/?couponCode=LINEARVIP2
Most of you already saw this coming…
After the release of my long-awaited Calculus course (https://deeplearningcourses.), the natural next step in the series is Linear Algebra.
Linear Algebra is all about how to deal with vectors and matrices, and understanding how they behave differently from “regular” numbers (scalars).
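To make that concrete, here's a tiny NumPy sketch (my own illustration, not from the course) showing two ways matrices behave unlike scalars: multiplication doesn't commute, and a nonzero matrix can square to zero.

```python
import numpy as np

# Scalars commute: a * b == b * a, always.
a, b = 3.0, 5.0
assert a * b == b * a

# Matrices generally do not: AB != BA.
A = np.array([[1., 2.],
              [3., 4.]])
B = np.array([[0., 1.],
              [1., 0.]])
print(A @ B)  # [[2. 1.], [4. 3.]]
print(B @ A)  # [[3. 4.], [1. 2.]]

# A nonzero matrix can even square to the zero matrix,
# which never happens for a nonzero scalar.
N = np.array([[0., 1.],
              [0., 0.]])
print(N @ N)  # [[0. 0.], [0. 0.]]
```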
This course will be VIP-only (there will not be any non-VIP version). The course is already quite long, but there is one section still in the works, on eigenvectors and eigenvalues (my favorite part).
Here’s what the course covers:
- Review of linear systems and Gaussian elimination (high school math)
- Matrices and vectors, basic operations (addition, subtraction, dot product, multiplication)
- Special matrices and matrix operations
  - Identity, diagonal, symmetric, orthogonal, positive definite, … matrices
  - Inverse, transpose, determinant, trace, …
- Matrix rank and low-rank approximations
- Brief intro to matrix decompositions / factorizations (Cholesky, LU, QR, SVD); see the short sketch after this list
- Applications to relevant concepts in modern machine learning
  - Neural embeddings
  - Vector similarity
  - Deep neural networks
  - Vanishing gradient problem
  - How to build GPT-4
  - Fine-tuning with LoRA for diffusion models and LLMs
  - Recommender systems
  - Topic modeling
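Since the curriculum covers low-rank approximations and the SVD, here's a minimal sketch of where that leads (my own toy example with made-up data, not course material): the best rank-k approximation of a matrix via truncated SVD, which is the same idea underlying simple recommender systems.

```python
import numpy as np

# Hypothetical toy "ratings" matrix (users x items); shape and data
# are made up purely for illustration.
rng = np.random.default_rng(0)
R = rng.standard_normal((6, 2)) @ rng.standard_normal((2, 5))  # rank-2 by construction
R += 0.01 * rng.standard_normal(R.shape)                       # plus a little noise

# Truncated SVD: keep only the top-k singular values/vectors.
U, s, Vt = np.linalg.svd(R, full_matrices=False)
k = 2
R_k = (U[:, :k] * s[:k]) @ Vt[:k, :]  # best rank-k approximation (Eckart-Young)

print(np.linalg.matrix_rank(R_k))                    # 2
print(np.linalg.norm(R - R_k) / np.linalg.norm(R))   # small relative error
```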
“But I already know about matrices and vectors!” you might say… If this is you, I totally get it. You might think there’s nothing more to learn: matrices and vectors are like arrays; you can add and subtract them if they’re the same size; matrix multiplication is a bit funny (it’s more like a dot product); and because of that, finding the inverse of a matrix is “complicated work”. That’s an okay rudimentary understanding that might help you with some basic ML concepts, but I encourage you to look at the exercises in the course curriculum and see if you can answer these questions (a small numerical sanity check for a few of them follows the list):
- What’s an easy way to find a vector that is normal to a plane?
- Why do orthogonal matrices only rotate vectors, and why is it always by the same amount?
- What’s the inverse of a product of matrices? (how do you distribute the inverse operation)
- What’s the transpose of the inverse of a symmetric matrix?
- What’s the determinant of a unitary matrix?
- Is the inverse of a positive definite matrix also positive definite? If so, why?
- Can you complete the square of a quadratic vector function?
- What’s the rule for determining the rank of a product of matrices, given the ranks of the matrices being multiplied?
- How can you generate a positive semi-definite matrix?
- Why do A^T A and A A^T have the same eigenvalues?
- How can you compute functions of matrices? E.g. what do exp(A), sin(A), cos(A), sqrt(A), etc… mean?
- When will eigenvalues be real-valued?
- Why do Hermitian matrices have orthogonal eigenvectors?
- How can you test for positive-definiteness using eigenvalues?
- Are positive definite matrices always invertible?
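If you want to test yourself numerically first, here's a quick NumPy sanity check (my own sketch, not course material) for three of the questions above: the inverse of a product, the shared eigenvalues of A^T A and A A^T, and the invertibility of positive definite matrices.

```python
import numpy as np

rng = np.random.default_rng(42)
A = rng.standard_normal((4, 4))
B = rng.standard_normal((4, 4))

# The inverse of a product reverses the order: (AB)^-1 = B^-1 A^-1.
lhs = np.linalg.inv(A @ B)
rhs = np.linalg.inv(B) @ np.linalg.inv(A)
assert np.allclose(lhs, rhs)

# A^T A (5x5) and A A^T (3x3) share the same nonzero eigenvalues.
M = rng.standard_normal((3, 5))
ev1 = np.linalg.eigvalsh(M.T @ M)[-3:]  # top 3 of the 5x5 (ascending order)
ev2 = np.linalg.eigvalsh(M @ M.T)       # all 3 of the 3x3
assert np.allclose(ev1, ev2)

# M M^T is positive semi-definite by construction; adding eps*I makes it
# positive definite, so all eigenvalues are strictly positive...
P = M @ M.T + 1e-6 * np.eye(3)
assert np.all(np.linalg.eigvalsh(P) > 0)
np.linalg.inv(P)  # ...and therefore it is invertible.
```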
If you can already answer all of these questions, great! If not, there may be more you need to learn…
Content to be completed in the coming days (update: complete):
- PDF notes
- Eigenvectors + eigenvalues
- Summaries for each section
Anyway, what are you waiting for?