November 14, 2018
August 21, 2017
It’s that time again!
BIG DISCOUNTS for everyone! If you’re in the USA you should see $10 coupons. If you’re in another country you’ll see the corresponding amount in your own currency.
But before we get to that, I want to mention that the VIP bonus for my latest Deep Learning course on GANs and Variational Autoencoders is CLOSING TODAY.
So if you want to get the VIP bonus and you haven’t gotten it yet, NOW is the time!
Just a reminder of what you get:
1) PDF cheatsheet / tutorial on Variational Autoencoders for your reading convenience
2) PDF cheatsheet / tutorial on GANs for your reading convenience (with exercises)
3) Pre-trained style transfer network! No need to train for 4 months on your slow CPU, or pay hundreds of dollars to use a GPU, or download 100s of MBs of Tensorflow checkpoint data! I’ve condensed the neural network weights to a few MBs so you can get going right away.
If you don’t know what “style transfer” is – that’s where I train a neural network to learn the “style” of Picasso or Da Vinci, and then apply it to a completely unrelated image like the Chicago skyline.
Very cool application of neural networks!
Remember: these VIP bonuses are ONLY available if you use the VIP coupon, which is automatically applied when you click this link:
Now, for the regular $10 discounts (check the end of this newsletter for how to get $10 coupons for ANY course on Udemy this week!):
Deep Learning Prerequisites: Linear Regression in Python https://www.udemy.com/data-science-linear-regression-in-python/?couponCode=AUG456
Deep Learning Prerequisites: Logistic Regression in Python https://www.udemy.com/data-science-logistic-regression-in-python/?couponCode=AUG456
Deep Learning in Python https://www.udemy.com/data-science-deep-learning-in-python/?couponCode=AUG456
Practical Deep Learning in Theano and TensorFlow https://www.udemy.com/data-science-deep-learning-in-theano-tensorflow/?couponCode=AUG456
Deep Learning: Convolutional Neural Networks in Python https://www.udemy.com/deep-learning-convolutional-neural-networks-theano-tensorflow/?couponCode=AUG456
Unsupervised Deep Learning in Python https://www.udemy.com/unsupervised-deep-learning-in-python/?couponCode=AUG456
Deep Learning: Recurrent Neural Networks in Python https://www.udemy.com/deep-learning-recurrent-neural-networks-in-python/?couponCode=AUG456
Advanced Natural Language Processing: Deep Learning in Python https://www.udemy.com/natural-language-processing-with-deep-learning-in-python/?couponCode=AUG456
Advanced AI: Deep Reinforcement Learning in Python https://www.udemy.com/deep-reinforcement-learning-in-python/?couponCode=AUG456
Easy Natural Language Processing in Python https://www.udemy.com/data-science-natural-language-processing-in-python/?couponCode=AUG456
Cluster Analysis and Unsupervised Machine Learning in Python https://www.udemy.com/cluster-analysis-unsupervised-machine-learning-python/?couponCode=AUG456
Unsupervised Machine Learning: Hidden Markov Models in Python https://www.udemy.com/unsupervised-machine-learning-hidden-markov-models-in-python/?couponCode=AUG456
Data Science: Supervised Machine Learning in Python https://www.udemy.com/data-science-supervised-machine-learning-in-python/?couponCode=AUG456
Bayesian Machine Learning in Python: A/B Testing https://www.udemy.com/bayesian-machine-learning-in-python-ab-testing/?couponCode=AUG456
Ensemble Machine Learning in Python: Random Forest and AdaBoost https://www.udemy.com/machine-learning-in-python-random-forest-adaboost/?couponCode=AUG456
Artificial Intelligence: Reinforcement Learning in Python https://www.udemy.com/artificial-intelligence-reinforcement-learning-in-python/?couponCode=AUG456
Deep Learning: GANs and Variational Autoencoders https://www.udemy.com/deep-learning-gans-and-variational-autoencoders/?couponCode=AUG456
SQL for Newbs and Marketers https://www.udemy.com/sql-for-marketers-data-analytics-data-science-big-data/?couponCode=AUG456
PREREQUISITE COURSE COUPONS
Last but not least, $10 coupons for some helpful prerequisite courses. You NEED to know this stuff before you study machine learning:
Calc 1 http://bit.ly/2okPUib
Calc 2 http://bit.ly/2oXnhpX
Calc 3 http://bit.ly/2pVU0gQ
Linalg 1 http://bit.ly/2oBBir1
Linalg 2 http://bit.ly/2q5SGEE
Probability (option 1) http://bit.ly/2prFQ7o
Probability (option 2) http://bit.ly/2p8kcC0
Probability (option 3) http://bit.ly/2oXa2pb
Probability (option 4) http://bit.ly/2oXbZSK
OTHER UDEMY COURSE COUPONS
As you know, I’m the “Lazy Programmer”, not just the “Lazy Data Scientist” – I love all kinds of programming!
If you have friends who are into any of these topics, do them a favor and let them know about these amazing discounts:
Ruby on Rails courses:
Big Data (Spark + Hadoop) courses:
EVEN MORE COOL STUFF
Into Yoga in your spare time? Photography? Painting? There are courses, and I’ve got coupons! If you find a course on Udemy that you’d like a coupon for, just let me know and I’ll hook you up!
Remember, these links will self-destruct on August 31 (10 days). Act NOW!
January 27, 2017
I would like to announce my latest course – Artificial Intelligence: Reinforcement Learning in Python.
This has been one of my most requested topics since I started covering deep learning. This course has been brewing in the background for months.
The result: This is my most MASSIVE course yet.
Usually, my courses will introduce you to a handful of new algorithms (which is a lot for people to handle already). This course covers SEVENTEEN (17!) new algorithms.
This will keep you busy for a LONG time.
If you’re used to supervised and unsupervised machine learning, realize this: Reinforcement Learning is a whole new ball game.
There are so many new concepts to learn, and so much depth. It’s COMPLETELY different from anything you’ve seen before.
That’s why we build everything slowly, from the ground up.
There’s tons of new theory, but as you’ve come to expect, anytime we introduce new theory it is accompanied by full code examples.
What is Reinforcement Learning? It’s the technology behind self-driving cars, AlphaGo, video game-playing programs, and more.
You’ll learn that while deep learning has been very useful for tasks like driving and playing Go, it’s in fact just a small part of the picture.
Reinforcement Learning provides the framework that allows deep learning to be useful.
Without reinforcement learning, all we have is a basic (albeit very accurate) labeling machine.
With Reinforcement Learning, you have intelligence.
Reinforcement Learning has even been used to model processes in psychology and neuroscience. It’s truly the closest thing we have to “machine intelligence” and “general AI”.
What are you waiting for? Sign up now!!
COUPON:
#artificial intelligence #deep learning #reinforcement learning
September 16, 2016
If you don’t want to read about the course and just want the 88% OFF coupon code, skip to the bottom.
In recent years, we’ve seen a resurgence in AI, or artificial intelligence, and machine learning.
Machine learning has led to some amazing results, like being able to analyze medical images and predict diseases on-par with human experts.
Google’s AlphaGo program was able to beat a world champion in the strategy game Go using deep reinforcement learning.
Machine learning is even being used to program self-driving cars, which is going to change the automotive industry forever. Imagine a world with drastically reduced car accidents, simply by removing the element of human error.
Google famously announced that they are now “machine learning first”, meaning that machine learning is going to get a lot more attention now, and this is what’s going to drive innovation in the coming years.
Machine learning is used in many industries, like finance, online advertising, medicine, and robotics.
It is a widely applicable tool that will benefit you no matter what industry you’re in, and it will also open up a ton of career opportunities once you get good.
Machine learning also raises some philosophical questions. Are we building a machine that can think? What does it mean to be conscious? Will computers one day take over the world?
The best part about this course is that it requires WAY less math than my usual courses; just some basic probability and geometry, no calculus!
In this course, we are first going to discuss the K-Nearest Neighbor algorithm. It’s extremely simple and intuitive, and it’s a great first classification algorithm to learn. After we discuss the concepts and implement it in code, we’ll look at some ways in which KNN can fail.
It’s important to know both the advantages and disadvantages of each algorithm we look at.
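To make the idea concrete, here’s a minimal KNN sketch in plain Numpy (the toy data and function names are mine, not taken from the course):

```python
import numpy as np

def knn_predict(X_train, y_train, x, k=3):
    # Distance from x to every training point
    dists = np.linalg.norm(X_train - x, axis=1)
    # Indices of the k nearest neighbors
    nearest = np.argsort(dists)[:k]
    # Majority vote among their labels
    labels, counts = np.unique(y_train[nearest], return_counts=True)
    return labels[np.argmax(counts)]

# Toy 2-D data: class 0 clustered near the origin, class 1 near (5, 5)
X = np.array([[0, 0], [1, 0], [0, 1], [5, 5], [6, 5], [5, 6]], dtype=float)
y = np.array([0, 0, 0, 1, 1, 1])

print(knn_predict(X, y, np.array([0.5, 0.5])))  # -> 0
print(knn_predict(X, y, np.array([5.5, 5.5])))  # -> 1
```

Note there’s no training step at all – KNN just stores the data, which is exactly why it gets slow at prediction time on big datasets.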
Next we’ll look at the Naive Bayes Classifier and the General Bayes Classifier. This is a very interesting algorithm to look at because it is grounded in probability.
We’ll see how we can transform the Bayes Classifier into a linear and quadratic classifier to speed up our calculations.
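As a rough sketch of the probabilistic idea, here’s a Gaussian Naive Bayes classifier on invented toy data (not the course’s exact code – just the “pick the class with the highest posterior” principle):

```python
import numpy as np

def fit_gaussian_nb(X, y):
    # Per-class mean, variance, and prior; the "naive" assumption is that
    # features are independent given the class
    params = {}
    for c in np.unique(y):
        Xc = X[y == c]
        params[c] = (Xc.mean(axis=0), Xc.var(axis=0) + 1e-9, len(Xc) / len(X))
    return params

def predict_gaussian_nb(params, x):
    # Pick the class with the highest log posterior (up to a constant)
    best_c, best_score = None, -np.inf
    for c, (mu, var, prior) in params.items():
        log_lik = -0.5 * np.sum(np.log(2 * np.pi * var) + (x - mu) ** 2 / var)
        score = log_lik + np.log(prior)
        if score > best_score:
            best_c, best_score = c, score
    return best_c

X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0],
              [5.0, 5.0], [6.0, 5.0], [5.0, 6.0]])
y = np.array([0, 0, 0, 1, 1, 1])
params = fit_gaussian_nb(X, y)
print(predict_gaussian_nb(params, np.array([0.5, 0.5])))  # -> 0
```

Expanding that Gaussian log-likelihood is exactly how you get the linear and quadratic forms: with shared covariances the quadratic terms cancel and a linear classifier remains.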
Next we’ll look at the famous Decision Tree algorithm. This is the most complex of the algorithms we’ll study, and most courses you’ll look at won’t implement them. We will, since I believe implementation is good practice.
The last algorithm we’ll look at is the Perceptron algorithm. Perceptrons are the ancestor of neural networks and deep learning, so they are important to study in the context of machine learning.
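Here’s a minimal sketch of the classic perceptron update rule, with toy data of my own:

```python
import numpy as np

def train_perceptron(X, y, epochs=20, lr=1.0):
    # y must be in {-1, +1}; a bias is folded in via an appended 1-column
    Xb = np.column_stack([X, np.ones(len(X))])
    w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        for xi, yi in zip(Xb, y):
            if yi * xi.dot(w) <= 0:   # misclassified (or on the boundary)
                w += lr * yi * xi     # nudge w toward classifying xi correctly
    return w

def predict_perceptron(w, X):
    Xb = np.column_stack([X, np.ones(len(X))])
    return np.sign(Xb.dot(w))

X = np.array([[0.0, 0.0], [1.0, 0.0], [5.0, 5.0], [6.0, 5.0]])
y = np.array([-1, -1, 1, 1])
w = train_perceptron(X, y)
print(predict_perceptron(w, X))  # perfectly separates this toy data
```

A single perceptron can only learn linearly separable problems – stacking them and adding nonlinearities is what turns this into a neural network.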
Once we’ve studied these algorithms, we’ll move on to more practical machine learning topics: hyperparameters, cross-validation, feature extraction, feature selection, and multiclass classification.
We’ll do a comparison with deep learning so you understand the pros and cons of each approach.
We’ll discuss the Scikit-Learn library, because even though implementing your own algorithms is fun and educational, you should use optimized and well-tested code in your actual work.
We’ll cap things off with a very practical, real-world example by writing a web service that runs a machine learning model and makes predictions. This is something that real companies do and make money from.
All the materials for this course are FREE. You can download and install Python, Numpy, and Scipy with simple commands on Windows, Linux, or Mac.
UPDATE: New coupon if the above is sold out:
#data science #machine learning #matplotlib #numpy #pandas #python
July 14, 2016
New course out today – Recurrent Neural Networks in Python: Deep Learning part 5.
If you already know what the course is about (recurrent units, GRU, LSTM), grab your 50% OFF coupon and go!:
Like the course I just released on Hidden Markov Models, Recurrent Neural Networks are all about learning sequences – but whereas Markov Models are limited by the Markov assumption, Recurrent Neural Networks are not. As a result, they are more expressive and more powerful, and they have driven progress on tasks that had been stuck for decades.
Sequences appear everywhere – stock prices, language, credit scoring, and webpage visits.
Recurrent neural networks have a history of being very hard to train. It hasn’t been until recently that we’ve found ways around what is called the vanishing gradient problem, and since then, recurrent neural networks have become one of the most popular methods in deep learning.
If you took my course on Hidden Markov Models, we are going to go through a lot of the same examples in this class, except that our results are going to be a lot better.
Our classification accuracies will increase, and we’ll be able to create vectors of words, or word embeddings, that allow us to visualize how words are related on a graph.
We’ll see some pretty interesting results, like that our neural network seems to have learned that all religions and languages and numbers are related, and that cities and countries have hierarchical relationships.
If you’re interested in discovering how modern deep learning has propelled machine learning and data science to new heights, this course is for you.
I’ll see you in class.
Click here for 50% OFF:
#data science #deep learning #gru #lstm #machine learning #word vectors
June 13, 2016
Hidden Markov Models are all about learning sequences.
A lot of the data that would be very useful for us to model is in sequences. Stock prices are sequences of prices. Language is a sequence of words. Credit scoring involves sequences of borrowing and repaying money, and we can use those sequences to predict whether or not you’re going to default. In short, sequences are everywhere, and being able to analyze them is an important skill in your data science toolbox.
The easiest way to appreciate the kind of information you get from a sequence is to consider what you are reading right now. If I had written the previous sentence backwards, it wouldn’t make much sense to you, even though it contained all the same words. So order is important.
While the current fad in deep learning is to use recurrent neural networks to model sequences, I want to first introduce you guys to a machine learning algorithm that has been around for several decades now – the Hidden Markov Model.
This course follows directly from my first course in Unsupervised Machine Learning for Cluster Analysis, where you learned how to measure the probability distribution of a random variable. In this course, you’ll learn to measure the probability distribution of a sequence of random variables.
You guys know how much I love deep learning, so there is a little twist in this course. We’ve already covered gradient descent and you know how central it is for solving deep learning problems. I claimed that gradient descent could be used to optimize any objective function. In this course I will show you how you can use gradient descent to solve for the optimal parameters of an HMM, as an alternative to the popular expectation-maximization algorithm.
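To illustrate the machinery (this is my own sketch, not the course’s code), here is the forward algorithm computing a sequence likelihood in Numpy; once the likelihood is a differentiable function of softmax-parameterized pi, A, and B, any gradient-based optimizer can adjust them:

```python
import numpy as np
from itertools import product

def hmm_likelihood(pi, A, B, obs):
    # Forward algorithm: alpha[i] = P(observations so far, current state = i)
    alpha = pi * B[:, obs[0]]
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
    return alpha.sum()

# 2 hidden states, 2 observation symbols (toy numbers of my own)
pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3],
              [0.2, 0.8]])   # A[i, j] = P(next state j | state i)
B = np.array([[0.9, 0.1],
              [0.3, 0.7]])   # B[i, k] = P(observe k | state i)

# Sanity check: likelihoods over all length-3 sequences must sum to 1
total = sum(hmm_likelihood(pi, A, B, seq) for seq in product([0, 1], repeat=3))
print(round(total, 6))  # -> 1.0
```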
We’re going to do it in Theano, which is a popular library for deep learning. This is also going to teach you how to work with sequences in Theano, which will be very useful when we cover recurrent neural networks and LSTMs.
This course is also going to go through the many practical applications of Markov models and hidden Markov models. We’re going to look at a model of sickness and health, and calculate how to predict how long you’ll stay sick, if you get sick. We’re going to talk about how Markov models can be used to analyze how people interact with your website, and fix problem areas like high bounce rate, which could be affecting your SEO. We’ll build language models that can be used to identify a writer and even generate text – imagine a machine doing your writing for you.
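The sickness/health calculation comes down to a geometric distribution: if you stay sick with probability p each day, the expected sick spell lasts 1/(1-p) days. A quick stdlib-only sketch (the numbers are invented):

```python
import random

def expected_duration(p_stay):
    # Time spent in a Markov state before leaving is geometric:
    # E[duration] = 1 / (1 - p_stay)
    return 1.0 / (1.0 - p_stay)

def simulate_duration(p_stay, trials=100_000, seed=0):
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        days = 1
        while rng.random() < p_stay:  # each day, stay sick with prob p_stay
            days += 1
        total += days
    return total / trials

p = 0.8                                 # 80% chance you stay sick each day
print(expected_duration(p))             # -> 5.0 expected sick days
print(round(simulate_duration(p), 1))   # simulation agrees, close to 5.0
```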
We’ll look at what is possibly the most recent and prolific application of Markov models – Google’s PageRank algorithm. And finally we’ll discuss even more practical applications of Markov models, including generating images, smartphone autosuggestions, and using HMMs to answer one of the most fundamental questions in biology – how is DNA, the code of life, translated into physical or behavioral attributes of an organism?
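Here’s a minimal power-iteration sketch of the PageRank idea on a toy 3-page web of my own (not from the course):

```python
import numpy as np

def pagerank(M, damping=0.85, iters=100):
    # M[i, j] = probability of following a link from page j to page i
    n = M.shape[0]
    r = np.full(n, 1.0 / n)   # start with equal rank everywhere
    for _ in range(iters):
        # With prob `damping` follow a link, otherwise jump to a random page
        r = (1 - damping) / n + damping * M @ r
    return r

# Page 0 links to 1 and 2; page 1 links to 2; page 2 links to 0
M = np.array([[0.0, 0.0, 1.0],
              [0.5, 0.0, 0.0],
              [0.5, 1.0, 0.0]])

ranks = pagerank(M)
print(ranks.round(3))  # ranks form a probability distribution (sum to 1)
```

Page 2 ends up with the highest rank here, because every other page links to it, directly or indirectly.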
All of the materials of this course can be downloaded and installed for FREE. We will do most of our work in Numpy and Matplotlib, along with a little bit of Theano. I am always available to answer your questions and help you along your data science journey.
#data science #deep learning #hidden markov models #machine learning #recurrent neural networks #theano
May 15, 2016
This course is the next logical step in my deep learning, data science, and machine learning series. I’ve done a lot of courses about deep learning, and I just released a course about unsupervised learning, where I talked about clustering and density estimation. So what do you get when you put these 2 together? Unsupervised deep learning!
In this course we’ll start with some very basic stuff – principal components analysis (PCA), and a popular nonlinear dimensionality reduction technique known as t-SNE (t-distributed stochastic neighbor embedding).
Next, we’ll look at a special type of unsupervised neural network called the autoencoder. After describing how an autoencoder works, I’ll show you how you can link a bunch of them together to form a deep stack of autoencoders, that leads to better performance of a supervised deep neural network. Autoencoders are like a non-linear form of PCA.
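As a rough illustration of the idea, here’s a single tied-weight autoencoder trained in plain Numpy on invented toy data (the course builds these up much more carefully):

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

# Toy data: 4-D points that really only vary along 2 directions
X = rng.random((100, 2)) @ rng.standard_normal((2, 4))

D, M = 4, 2                            # 4 visible units, 2 hidden (the bottleneck)
W = rng.standard_normal((D, M)) * 0.1
b, c = np.zeros(M), np.zeros(D)

def reconstruction_mse():
    Z = sigmoid(X @ W + b)                   # encode
    return np.mean((Z @ W.T + c - X) ** 2)   # decode with tied weights

mse_before = reconstruction_mse()
lr = 0.05
for _ in range(1000):
    Z = sigmoid(X @ W + b)
    err = Z @ W.T + c - X                    # reconstruction error
    dZ = (err @ W) * Z * (1 - Z)
    # W gets gradient from both the encoder and the decoder (tied weights)
    W -= lr * (X.T @ dZ + err.T @ Z) / len(X)
    b -= lr * dZ.sum(axis=0) / len(X)
    c -= lr * err.sum(axis=0) / len(X)

print(mse_before, "->", reconstruction_mse())  # error shrinks during training
```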
Last, we’ll look at restricted Boltzmann machines (RBMs). These are yet another popular unsupervised neural network, that you can use in the same way as autoencoders to pretrain your supervised deep neural network. I’ll show you an interesting way of training restricted Boltzmann machines, known as Gibbs sampling, a special case of Markov Chain Monte Carlo, and I’ll demonstrate how even though this method is only a rough approximation, it still ends up reducing other cost functions, such as the one used for autoencoders. This method is also known as Contrastive Divergence or CD-k. As in physical systems, we define a concept called free energy and attempt to minimize this quantity.
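To make the Gibbs-sampling procedure concrete, here’s a compact CD-1 sketch in Numpy (the toy data and hyperparameters are mine):

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def cd1_step(W, b, c, v0, lr=0.1):
    # Positive phase: hidden probabilities driven by the data
    ph0 = sigmoid(v0 @ W + c)
    h0 = (rng.random(ph0.shape) < ph0).astype(float)  # Gibbs sample h ~ p(h|v)
    # Negative phase: one Gibbs step back down gives a "fantasy" reconstruction
    pv1 = sigmoid(h0 @ W.T + b)
    v1 = (rng.random(pv1.shape) < pv1).astype(float)  # Gibbs sample v ~ p(v|h)
    ph1 = sigmoid(v1 @ W + c)
    # CD-1 update: <v h>_data minus <v h>_reconstruction
    n = len(v0)
    W += lr * (v0.T @ ph0 - v1.T @ ph1) / n
    b += lr * (v0 - v1).mean(axis=0)
    c += lr * (ph0 - ph1).mean(axis=0)

def recon_error(W, b, c, X):
    pv = sigmoid(sigmoid(X @ W + c) @ W.T + b)  # mean-field reconstruction
    return np.mean((pv - X) ** 2)

# Toy binary data: either the first 3 bits are on, or the last 3 are
X = np.array([[1, 1, 1, 0, 0, 0]] * 20 + [[0, 0, 0, 1, 1, 1]] * 20, dtype=float)
W = rng.standard_normal((6, 2)) * 0.1
b, c = np.zeros(6), np.zeros(2)

err_before = recon_error(W, b, c, X)
for _ in range(500):
    cd1_step(W, b, c, X)
print(err_before, "->", recon_error(W, b, c, X))  # reconstruction improves
```

Note that reconstruction error is not the quantity CD-1 directly optimizes – that’s the point made above: the method only approximates the free-energy gradient, yet reconstruction still improves.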
Finally, we’ll bring all these concepts together and I’ll show you visually what happens when you use PCA and t-SNE on the features that the autoencoders and RBMs have learned, and we’ll see that even without labels the results suggest that a pattern has been found.
All the materials used in this course are FREE. Since this course is the 4th in the deep learning series, I will assume you already know calculus, linear algebra, and Python coding. You’ll want to install Numpy and Theano for this course. These are essential items in your data analytics toolbox.
If you are interested in deep learning and you want to learn about modern deep learning developments beyond just plain backpropagation, including using unsupervised neural networks to interpret what features can be automatically and hierarchically learned in a deep learning system, this course is for you.
Get your EARLY BIRD coupon for 50% off here: https://www.udemy.com/unsupervised-deep-learning-in-python/?couponCode=EARLYBIRD
April 20, 2016
[Scroll to the bottom if you want to jump straight to the coupon]
Cluster analysis is a staple of unsupervised machine learning and data science.
It is very useful for data mining and big data because it automatically finds patterns in the data, without the need for labels, unlike supervised machine learning.
In a real-world environment, you can imagine that a robot or an artificial intelligence won’t always have access to the optimal answer, or maybe there isn’t an optimal correct answer. You’d want that robot to be able to explore the world on its own, and learn things just by looking for patterns.
Do you ever wonder how we get the data that we use in our supervised machine learning algorithms?
We always seem to have a nice CSV or a table, complete with Xs and corresponding Ys.
If you haven’t been involved in acquiring data yourself, you might not have thought about this, but someone has to make this data!
Those “Y”s have to come from somewhere, and a lot of the time that involves manual labor.
Sometimes, you don’t have access to this kind of information or it is infeasible or costly to acquire.
But you still want to have some idea of the structure of the data. If you’re doing data analytics, automating pattern recognition in your data would be invaluable.
This is where unsupervised machine learning comes into play.
In this course we are first going to talk about clustering. This is where instead of training on labels, we try to create our own labels! We’ll do this by grouping together data that looks alike.
There are 2 methods of clustering we’ll talk about: k-means clustering and hierarchical clustering.
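To make the assign/update loop of k-means concrete, here’s a minimal Numpy sketch on toy blobs of my own:

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    rng = np.random.default_rng(seed)
    # Initialize centers at k random data points
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        # Assign each point to its nearest center...
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(axis=2), axis=1)
        # ...then move each center to the mean of its assigned points
        centers = np.array([X[labels == j].mean(axis=0) for j in range(k)])
    return centers, labels

# Two well-separated blobs around (0, 0) and (5, 5)
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.5, (50, 2)), rng.normal(5, 0.5, (50, 2))])
centers, labels = kmeans(X, 2)
print(centers.round(1))  # one center near each blob
```

(A production implementation would also handle empty clusters and test for convergence; this sketch keeps only the core loop.)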
Next, because in machine learning we like to talk about probability distributions, we’ll go into Gaussian mixture models and kernel density estimation, where we talk about how to “learn” the probability distribution of a set of data.
One interesting fact is that under certain conditions, Gaussian mixture models and k-means clustering are exactly the same! You can think of GMMs as a “souped up” version of k-means. We’ll prove how this is the case.
All the algorithms we’ll talk about in this course are staples in machine learning and data science, so if you want to know how to automatically find patterns in your data with data mining and pattern extraction, without needing someone to put in manual work to label that data, then this course is for you.
All the materials for this course are FREE. You can download and install Python, Numpy, and Scipy with simple commands on Windows, Linux, or Mac.
#agglomerative clustering #cluster analysis #data mining #data science #expectation-maximization #gaussian mixture model #hierarchical clustering #k-means clustering #kernel density estimation #pattern recognition #udemy #unsupervised machine learning
March 19, 2016
This is an announcement, along with free and discount coupons, for my new course, SQL for Marketers: Dominate data analytics, data science, and big data.
More and more companies these days are learning that they need to make DATA-DRIVEN decisions.
With big data and data science on the rise, we have more data than we know what to do with.
One of the basic languages of data analytics is SQL, which is used for many popular databases including MySQL, Postgres, Microsoft SQL Server, Oracle, and even big data solutions like Hive and Cassandra.
I’m going to let you in on a little secret. Most high-level marketers and product managers at big tech companies know how to manipulate data to gain important insights. No longer do you have to wait around the entire day for some software engineer to answer your questions – now you can find the answers directly, by yourself, using SQL.
Do you want to know how to optimize your sales funnel using SQL, look at the seasonal trends in your industry, and run a SQL query on Hadoop? Then join me now in my new class, SQL for Marketers: Dominate data analytics, data science, and big data!
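As a tiny illustration of the kind of query involved (the schema and numbers are invented, and it’s run through Python’s sqlite3 so it’s self-contained):

```python
import sqlite3

# Toy funnel table: one row per user event
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id INTEGER, step TEXT)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?)",
    [(1, "visit"), (1, "signup"), (1, "purchase"),
     (2, "visit"), (2, "signup"),
     (3, "visit")],
)

# How many distinct users reached each step of the funnel?
rows = conn.execute(
    "SELECT step, COUNT(DISTINCT user_id) AS users "
    "FROM events GROUP BY step ORDER BY users DESC"
).fetchall()
print(rows)  # -> [('visit', 3), ('signup', 2), ('purchase', 1)]
```

That one GROUP BY is the whole funnel report – no waiting on an engineer required.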
P.S. If you haven’t yet signed up for my newsletter at lazyprogrammer [dot] me, you’ll want to do so before Monday, especially if you want to learn more about deep learning, because I have a special announcement coming up that will NOT be announced on Udemy.
Here are the coupons:
FREE coupon for early early birds:
EARLYBIRD (Sold out)
If the first coupon has run out, you may still use the 2nd coupon, which gives you 70% off:
#aws #big data #cassandra #Data Analytics #ec2 #hadoop #Hive #Microsoft SQL Server #MySQL #Oracle #Postgres #S3 #spark #sql #sqlite
February 26, 2016
This course continues where my first course, Deep Learning in Python, left off. You already know how to build an artificial neural network in Python, and you have a plug-and-play script that you can use for TensorFlow.
You learned about backpropagation (and because of that, this course contains basically NO MATH), but there were a lot of unanswered questions. How can you modify it to improve training speed? In this course you will learn about batch and stochastic gradient descent, two commonly used techniques that allow you to train on just a small sample of the data at each iteration, greatly speeding up training time.
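Here’s a minimal sketch of mini-batch stochastic gradient descent on a toy linear-regression problem (all names and numbers are mine, not the course’s):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: y = 3x + 2 plus a little noise
X = rng.random((200, 1))
y = 3 * X[:, 0] + 2 + rng.normal(0, 0.1, 200)

w, b = 0.0, 0.0
lr, batch_size = 0.5, 32
for epoch in range(200):
    idx = rng.permutation(len(X))              # shuffle each epoch
    for start in range(0, len(X), batch_size):
        batch = idx[start:start + batch_size]  # a small random sample...
        xb, yb = X[batch, 0], y[batch]
        err = w * xb + b - yb
        w -= lr * (err * xb).mean()            # ...is enough for a gradient step
        b -= lr * err.mean()

print(round(w, 1), round(b, 1))  # close to the true 3 and 2
```

Each update sees only 32 of the 200 points, yet the parameters still converge – that’s the whole speed advantage.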
You will also learn about momentum, which can help carry you through local minima and prevent you from having to be too conservative with your learning rate. You will also learn about adaptive learning rate techniques like AdaGrad and RMSprop, which can also help speed up your training.
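To make the update rules concrete, here’s a rough sketch of momentum and RMSprop minimizing a toy 1-D quadratic (the hyperparameters are invented, not the course’s):

```python
import math

def minimize(grad, x0, steps=200, lr=0.1, mu=0.9, decay=0.99, method="momentum"):
    x, v, cache = float(x0), 0.0, 0.0
    for _ in range(steps):
        g = grad(x)
        if method == "momentum":
            v = mu * v - lr * g   # velocity accumulates past gradients
            x += v
        else:                     # RMSprop: per-parameter adaptive step size
            cache = decay * cache + (1 - decay) * g ** 2
            x -= lr * g / (math.sqrt(cache) + 1e-8)
    return x

# Minimize f(x) = (x - 3)^2, so grad f = 2(x - 3); the minimum is at x = 3
grad = lambda x: 2 * (x - 3)
print(minimize(grad, 0.0, method="momentum"))  # approaches 3
print(minimize(grad, 0.0, method="rmsprop"))   # approaches 3
```

In real training these are applied per-parameter across the whole weight vector; the scalar version just makes the two update rules easy to compare side by side.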
In my last course, I just wanted to give you a little sneak peek at TensorFlow. In this course we are going to start from the basics so you understand exactly what’s going on – what are TensorFlow variables and expressions and how can you use these building blocks to create a neural network? We are also going to look at a library that’s been around much longer and is very popular for deep learning – Theano. With this library we will also examine the basic building blocks – variables, expressions, and functions – so that you can build neural networks in Theano with confidence.
Because one of the main advantages of TensorFlow and Theano is the ability to use the GPU to speed up training, I will show you how to set up a GPU-instance on AWS and compare the speed of CPU vs GPU for training a deep neural network.
With all this extra speed, we are going to look at a real dataset – the famous MNIST dataset (images of handwritten digits) and compare against various known benchmarks.
#adagrad #aws #batch gradient descent #deep learning #ec2 #gpu #machine learning #nesterov momentum #numpy #nvidia #python #rmsprop #stochastic gradient descent #tensorflow #theano