VIP Promotion
Machine Learning: Natural Language Processing in Python (V2)
===The Complete Natural Language Processing Course Has Arrived===
Hello friends!
Welcome to my latest course, on Natural Language Processing (NLP).
Don’t want to read my little spiel? Just click here to get the VIP discount (expires in 30 days – Jan 20, 2022!):
https://www.udemy.com/course/natural-language-processing-in-python/?couponCode=NLPVIP
UPDATE: The opportunity to get the VIP version on Udemy has expired. However, the main part of the course (without the VIP parts) is now available at a new low price. Click here to automatically get the current lowest price: https://bit.ly/3nT5fTX
You can get the FULL VIP version HERE: https://deeplearningcourses.com/c/natural-language-processing-in-python
UPDATE 2: Some of you may see the full price of $199 USD without any discount. This is because promotions going forward will now be decided by Udemy, so you will only get whatever discounts they offer. Such is the downside of not getting the VIP version. From what I hear, promotions happen quite often, so you should not have to wait too long.
UPDATE 3: I’ve updated the above with an actual coupon code, so ALL students should see a discount.
UPDATE 4: For those of you waiting for me to finish the rest of the course (e.g. the deep learning sections), that has now been done. I’ve also added a big handful of advanced notebooks to the VIP content! (see “Part 6” below)
IMPORTANT INFO: For those of you who missed the VIP discount but still want access to the VIP content, scroll to the bottom of this post. The same goes for those who got the VIP version on Udemy and want to access the VIP content for free at its new permanent home.
“Wait a minute… don’t you already have like, 3 courses on NLP?”
Yes!
My first NLP course was released over 5 years ago. While there have been updates to it over the years, it has turned into a Frankenstein monster of sorts.
Therefore, the logical action was to simply start anew.
This course is another MASSIVE one – I say it’s basically 4 courses in 1 (not including the VIP section).
One of those “courses” (the ML part) is a revamp of my original 2016 NLP course, which makes this new course a superset of NLP V1. TL;DR: way more content, better organization.
Let’s get to the details:
Part 1: Vector models and text preprocessing
- Tokenization, stemming, lemmatization, stopwords, etc.
- CountVectorizer and TF-IDF
- Basic intro to word2vec and GloVe
- Build a text classifier (see the sketch after this list)
- Build a recommendation engine
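None of this is the actual course code – just a minimal sketch of the Part 1 workflow, assuming scikit-learn and using its built-in 20 newsgroups data as a stand-in corpus:

```python
from sklearn.datasets import fetch_20newsgroups
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# Stand-in dataset: two newsgroup categories as a binary classification task
categories = ["sci.space", "rec.autos"]
train = fetch_20newsgroups(subset="train", categories=categories)
test = fetch_20newsgroups(subset="test", categories=categories)

# TF-IDF turns each document into a sparse, weighted bag-of-words vector
vectorizer = TfidfVectorizer(stop_words="english")
X_train = vectorizer.fit_transform(train.data)
X_test = vectorizer.transform(test.data)

# Any linear classifier works well on these sparse vectors
model = LogisticRegression(max_iter=1000)
model.fit(X_train, train.target)
print("test accuracy:", model.score(X_test, test.target))
```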
Part 2: Probability models
- Markov models and language models (see the sketch after this list)
- Article spinner
- Cipher decryption
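To give you the flavor of Part 2, here is a toy sketch (mine, not the course’s) of a first-order Markov language model: count word-to-word transitions, then sample from them.

```python
import random
from collections import defaultdict

text = "the cat sat on the mat and the cat ran to the door"
words = text.split()

# Count transitions: for each word, collect the words observed right after it
transitions = defaultdict(list)
for prev, nxt in zip(words, words[1:]):
    transitions[prev].append(nxt)

# Generate text by repeatedly sampling the next word given the current one;
# sampling from the list is proportional to the observed transition counts
word = "the"
output = [word]
for _ in range(8):
    followers = transitions.get(word)
    if not followers:  # dead end: this word was never followed by anything
        break
    word = random.choice(followers)
    output.append(word)
print(" ".join(output))
```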
Part 3: Machine learning
- Spam detection with Naive Bayes
- Sentiment analysis with Logistic Regression
- Text summarization with TF-IDF and TextRank
- Topic modeling with Latent Dirichlet Allocation and Non-negative Matrix Factorization*
- Latent semantic indexing (LSI / LSA) with PCA / SVD* (see the sketch after this list)
- VIP only: Applying LSI to text summarization, topic modeling, classification, and recommendations*
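As a taste of the LSA material, here is a minimal sketch (my own illustration, not course code) of latent semantic analysis as a truncated SVD of a TF-IDF term-document matrix, on a made-up four-document corpus:

```python
from sklearn.decomposition import TruncatedSVD
from sklearn.feature_extraction.text import TfidfVectorizer

corpus = [
    "the cat sat on the mat",
    "dogs and cats are pets",
    "stocks fell as markets reacted to the news",
    "investors sold shares after the earnings report",
]

# TF-IDF term-document matrix (documents as rows)
X = TfidfVectorizer().fit_transform(corpus)

# LSA = truncated SVD: project each document onto 2 latent "topic" axes
lsa = TruncatedSVD(n_components=2, random_state=0)
Z = lsa.fit_transform(X)
print(Z.round(2))  # the pet documents and the finance documents separate
```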
Part 4: Deep learning*
- Embeddings
- Feedforward ANNs
- CNNs
- RNNs / LSTMs (see the sketch below)
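Here is a minimal Keras sketch of the pattern Part 4 builds on – an embedding layer feeding a recurrent layer for text classification. The vocabulary size, sequence length, and dimensions below are placeholder values, not the course’s:

```python
import tensorflow as tf

V, T, D = 10000, 100, 32  # placeholder vocab size, sequence length, embedding dim

model = tf.keras.Sequential([
    tf.keras.Input(shape=(T,)),
    # Embedding: maps each word index to a learned D-dimensional vector
    tf.keras.layers.Embedding(input_dim=V, output_dim=D),
    # LSTM reads the sequence of word vectors and summarizes it in one state
    tf.keras.layers.LSTM(64),
    # Binary output, e.g. positive vs. negative sentiment
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()
# model.fit(X_train, y_train, ...) where X_train is an (N, T) array of word indices
```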
Part 5: Beginner’s Corner on Transformers with Hugging Face (VIP only)
- Sentiment analysis revisit
- Text generation revisit
- Article spinning revisit
- Question-answering
- Zero-shot classification
Part 6: Even MORE bonus VIP notebooks (VIP only)
- Stock Movement Prediction Using News
- LSA / LSI for Recommendations
- LSA / LSI for Classification (Feature Engineering)
- LSA / LSI for Topic Modeling
- LSA / LSI for Text Summarization (2 methods)
- LSTM for Text Generation Notebook (i.e. the “decoder” part of an encoder-decoder network)
- Masked language model with LSTM Notebook (revisiting the article spinner)
I’m sure many of you are most excited about the Transformers VIP section. Please note that this is not a full course on Transformers. As you know, I like to go very in-depth, and this is a topic that deserves its own course. This VIP section is a “beginner’s corner”-style set of lectures, which outlines the tasks that Transformers can do (listed above), along with code examples for each task. The Transformer-specific code is very simple – basically just 1 or 2 lines – which is great for practical purposes. Don’t worry, the actual notebooks are much longer than that, and demonstrate real, meaningful use cases. This section does not show you how to train or fine-tune a Transformer, only how to use existing pre-trained models. If you just want to apply these state-of-the-art models without caring about the nitty-gritty details, this is perfect for you.
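To show what I mean by “1 or 2 lines”, here is a minimal sketch using Hugging Face’s pipeline helper (relying on the library’s default models; your exact outputs will vary):

```python
from transformers import pipeline

# Sentiment analysis: one line to build the classifier, one line to use it
classifier = pipeline("sentiment-analysis")
print(classifier("This course is a superset of NLP V1!"))
# e.g. [{'label': 'POSITIVE', 'score': 0.99...}]

# Zero-shot classification: label text with categories it was never trained on
zero_shot = pipeline("zero-shot-classification")
print(zero_shot(
    "The stock price jumped after the earnings report.",
    candidate_labels=["finance", "sports", "politics"],
))
```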
Is the VIP section only ideal for beginners? NO! Despite the name, this section will be useful for everyone, especially those who are interested in Transformers. This is quite a complex topic, and getting “good” with Transformers really requires a step-by-step approach. Think of this as the first step.
What is the “VIP version”? As usual, the VIP version of the course contains extra VIP content only available to those who purchase the course during the VIP period (i.e. now). This content will be removed when it becomes a regular, non-VIP course, at which point I will make an announcement. All who sign up for the VIP version will retain access to the VIP content forever via my website – simply let me know via email that you’d like access (you only need to email if I announce that the VIP period is ending).
NOTE: If you are interested in Transformers, much of this course consists of important prerequisites. The language models and article spinner from Part 2 (“Probability models”) are very important for understanding pre-training methods. The deep learning sections are very important for learning about embeddings and how neural networks deal with sequences.
NOTE: As with my last few releases, I wanted to get the course into your hands as early as possible, so some sections were still in progress at launch – specifically, those denoted with an asterisk (*) above. UPDATE: All post-release sections have now been uploaded!
So what are you waiting for? Get the VIP version of Natural Language Processing (V2) NOW:
For those who missed the VIP version but still want it:
Yes, you can still get the VIP content! It can now be purchased separately on deeplearningcourses.com.
You can get it here: https://
For those of you who already purchased the VIP version and want to get setup with the VIP content on deeplearningcourses.com:
Email me with your name (exactly as it appears on Udemy) along with your date of purchase. I will look up your details to confirm.
If required, read more about how the “VIP version” works here: https://lazyprogrammer.
Does this course replace “Natural Language Processing with Deep Learning in Python”, or “Deep Learning: Advanced NLP and RNNs”?
No – this course replaces neither of those more advanced NLP courses.
Let’s first consider “Natural Language Processing with Deep Learning in Python”.
That course generally covers more advanced topics.
For instance, both variants of word2vec (skip-gram and CBOW) are discussed in detail and implemented from scratch. In the current course, only the very basic ideas are discussed.
GloVe, another word embedding algorithm, is also taught there in detail, along with a from-scratch implementation. In the current course, it is again only mentioned briefly.
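Even at the “mentioned briefly” level, pretrained word vectors are immediately useful. A minimal sketch, assuming gensim and its downloadable “glove-wiki-gigaword-50” model:

```python
import gensim.downloader as api

# Downloads a small pretrained GloVe model on first use
wv = api.load("glove-wiki-gigaword-50")

print(wv.most_similar("king", topn=3))  # semantically similar words
# The classic analogy: king - man + woman ≈ queen
print(wv.most_similar(positive=["king", "woman"], negative=["man"], topn=1))
```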
That course reviews RNNs, but goes into great detail on a completely different architecture, the “Recursive Neural Tensor Network” (RNTN).
Essentially, this is a neural network structured like a tree, which is very useful for tasks such as sentiment analysis, where you may want to capture the negation of whole phrases (something a tree structure accomplishes easily).
How about “Deep Learning: Advanced NLP and RNNs”?
Again, there is essentially no overlap.
As the title suggests, that course covers more advanced topics. Like the previously mentioned course, it can be thought of as another sequel to the current course.
This course covers topics such as: bidirectional RNNs, seq2seq (for many-to-many tasks where the input length is not equal to the target length), attention (the central mechanism in transformers), and memory networks.
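Since attention is the central mechanism in Transformers, here is a minimal numpy sketch of scaled dot-product attention (made-up dimensions; my illustration rather than course code):

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    # Scores: how strongly each query position attends to each key position
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    weights = softmax(scores, axis=-1)  # each row sums to 1
    return weights @ V  # output = weighted average of the value vectors

Tq, Tk, d = 3, 5, 8  # made-up query length, key/value length, vector dimension
rng = np.random.default_rng(0)
Q = rng.normal(size=(Tq, d))
K = rng.normal(size=(Tk, d))
V = rng.normal(size=(Tk, d))
print(attention(Q, K, V).shape)  # (3, 8): one output vector per query position
```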