Quite a few students in my SQL course have asked me about how to install SQLite on Windows.

On the surface, it seems pretty simple: download some files, unzip them, put them into C:\sqlite, and add C:\sqlite to your PATH.

Most people who use Windows don’t have experience setting environment variables or using the command line, so I’ve decided to create this visual walkthrough to help.

Earlier this week, I mentioned Udemy was doing a promotion on Tech courses only (if you were signed up for my newsletter, you would have gotten the announcement). I’ve just heard that they’ve opened up the $10 sale to ALL courses for the next 3 days only!

What this means: All my courses will continue to be on sale for $10 (just click the below links). But in addition, you can find other courses (including calculus and probability prerequisites) for $10 too!

With some time off, now is the PERFECT time to catch up on your deep learning / machine learning / data science skills. It’s almost 2018 and AI is rising faster than ever.

What better way than to grab all the deep learning courses you’ll ever want to take, for just $10?

Don’t forget, this is the LOWEST possible price on Udemy – get these courses NOW. We really don’t know when the next big sale is going to be.

If you want to type in the coupon code manually, it’s: WINTER2017 (remember, this is only for my courses). However, the coupon codes are included automatically in the links below.

This sale lasts until Dec. 21 (3 days). Don’t wait!

Into Yoga in your spare time? Photography? Painting? There are courses, and I’ve got coupons! If you find a course on Udemy that you’d like a coupon for, just let me know and I’ll hook you up!
Remember, these links will self-destruct on December 21 (3 days). Act NOW!

A lot of you have been asking me… “When is the $10 sale coming back?”

And as you know, I share the news as soon as I find out – so here it is.

Black Friday is THE BIGGEST SALE OF THE YEAR.

This is the lowest price possible on Udemy.

I always make sure to mention to everyone: grab everything while you can because we just don’t know when the next big sale is going to be!

Don’t get stuck for months wondering… “When is the next $10 sale coming back?” Just get everything now! (Even if you don’t plan on taking the course for some time.)

Enough babbling, let’s get to the coupons. Remember: there’s no need to type in the coupon code manually – I’ve already provided the links so all you need to do is click and add to cart!

But just in case you’re curious – the coupon code is BLACKFRIDAY2017.

Also, make sure you scroll down to the bottom for some important updates.

In my last post, I relaunched my course “Modern Deep Learning in Python” which has more than doubled in size since its inception. Along with this re-release I offered a special VIP version of the course where you get a free 28-page tutorial on Tensorflow’s new Estimator API.

As mentioned, this deal is not going to last. In fact, 24 hours from now, it will be GONE FOREVER. Remember: you MUST use the VIP coupon to get the VIP material.

PREREQUISITE COURSE COUPONS

And just as important, $10 coupons for some helpful prerequisite courses. You NEED to know this stuff before you study machine learning:

Remember, these links will self-destruct on November 28 (13 days). Act NOW!

If you’ve been following my updates, you may have noticed that I’ve been hard at work doubling the size of my course, “Deep Learning part 2”, otherwise known as “Practical Deep Learning in Theano and Tensorflow”.

I’ve since renamed the course to “Modern Deep Learning in Python” and I am officially re-launching it today!

At this point, I have completed all the major updates I’ve had in my pipeline: extending the modern regularization section, adding batch normalization, Adam, and more code for other modern libraries like Keras, PyTorch, CNTK, and MXNet.

As part of this relaunch, I am releasing a VIP version of this course.

What do you get?

Well, in addition to the HOURS of free content I’ve just added to the course, the VIP version includes a bonus on Tensorflow’s new Estimator API (you may have heard of it).

It was released just a few months ago.

Why is it better?

Greatly simplifies machine learning programming

No need to deal with Graphs or Sessions

Encapsulates training, evaluation, prediction, and exporting models

Provides standard models so you don’t have to write any of the code yourself

You may have noticed that writing Tensorflow code can be quite repetitive. We need to define each layer, combine the layers to calculate the output, create a loss function, create an optimizer, initialize the variables, run the optimizer, plot the loss, and so on.

All “boilerplate” stuff (although helpful to repeat if you are in the process of learning Tensorflow).

But while the Estimator API simplifies machine learning programming, it is not necessarily easy. And hence, I’ve written a 28-page tutorial to teach you how to use it from the ground up.

We start out with the Scikit-Learn API, and gradually build on those ideas to familiarize ourselves with the new Estimator API.

We go through a FULL CODE example on a dataset NEVER-BEFORE SEEN in my courses. Some new data wrangling techniques will be taught, in particular: the “hashing trick” and how to create embeddings instead of one-hot encoding categorical variables.
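To give a rough idea of what the “hashing trick” does (this is my own toy sketch, not the tutorial’s code; the function name and bucket count are made up for illustration):

```python
import hashlib

def hash_bucket(category, num_buckets=10):
    # Map an arbitrary category string to one of num_buckets indices.
    # md5 is used instead of Python's built-in hash() so the mapping is
    # stable across runs (hash() is salted per process in Python 3).
    digest = hashlib.md5(category.encode("utf-8")).hexdigest()
    return int(digest, 16) % num_buckets

# Categories never seen before still get a valid index --
# no need to build a fixed vocabulary up front.
for color in ["red", "green", "chartreuse"]:
    print(color, "->", hash_bucket(color))
```

Collisions are possible by design; with enough buckets they are rare enough that most models tolerate them.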

[Note: make SURE you use the coupon code IAMAVIP2 – those who do not use the code will not get the VIP material!]

—

Now I realize that a lot of you might already be signed up for this course. That means, you get all the updated content for free and have been seeing the updates come through as I’ve added them. Woohoo!

I’ve temporarily decreased the price to $10. Courses on deeplearningcourses.com will always contain the extra VIP material.

You can think of it like a small donation as a token of appreciation, but don’t forget you are still getting this 28-PAGE easy-to-follow tutorial on the Estimator API. Believe me, I’ve looked at other resources out there – they were not fun reading.

So, if you are already a student of Modern Deep Learning in Python and you would like to access the VIP stuff, get it now! This price won’t last.

—

To be sure: I still have TONS of stuff I still want to add to this course and my other courses. So much that sometimes I wonder how I’m going to get it all done! If you support this effort and you want to see MORE of it in the future, please do consider getting the VIP version of this course. I am very thankful for all the support!

Note: if you order the VIP version of the course through Udemy, you should receive a link to the VIP material within 24 hours of purchase in your message inbox. So don’t forget to check your messages! Shoot me a message if you haven’t got your VIP material by that time.

I’ve been really busy adding tons of free updates to my existing courses. You can scroll down to the very bottom to see what they are. But in the meantime, we are going to do another HUGE sale. ALL courses on Udemy are now $12. Take this opportunity to grab as many courses as you can because you never know when the next sale is going to be!

As usual, I’m providing $12 coupons for all my courses in the links below. Please use these links and share them with your friends!

You can also just type in the coupon code “OCT456”.

The promo goes until October 31. Don’t wait!

At the end of this post, I’m going to provide you with some additional links to get machine learning prerequisites (calculus, linear algebra, Python, etc…) for $12 too!

But that’s not all… I’m the Lazy Programmer, not just the Lazy Data Scientist – I’ve got $12 coupons for iOS development, Android development, Ruby on Rails, Python, Big Data / Hadoop / Spark, React.js, Angular, and MORE. All important skillsets on ANY engineering team. Got any friends or coworkers in mobile / backend / big data development? Let them know!


COURSE UPDATES

Recent updates to existing courses because my students are awesome and deserve free stuff:

Deep Learning pt 2: More lectures on hyperparameter optimization

Deep Learning pt 2: Keras!!! (very popular request)

Deep Learning pt 2: Noise injection

Deep Learning pt 1: Why Learn the Ins and Outs of Backpropagation?

All relevant courses: How to uncompress a .tar.gz file

Remember, these links will self-destruct on October 31 (4 days). Act NOW!

The Google Brain team has just released a new paper (https://arxiv.org/abs/1710.05941) that demonstrates the superiority of a new activation function called Swish on a number of different neural network architectures.

This is interesting because people often ask me, “which activation function should I use?”

These days, it is common to just use the ReLU by default.

To refresh your memory, the ReLU looks like this:

And it is defined by the equation:

$$ f(x) = \max(0, x) $$

One major problem with the ReLU is that its derivative is 0 for half the values of the input \( x \). Because we use “gradient descent” as our parameter update algorithm, if the gradient is 0 for a parameter, then that parameter will not be updated!

This leads to the problem of “dead neurons”. Experiments have shown that neural networks trained with ReLUs can have up to 40% dead neurons!
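To make the dead-gradient problem concrete, here is a minimal NumPy sketch of the ReLU and its derivative (my own code, for illustration):

```python
import numpy as np

def relu(x):
    # ReLU: f(x) = max(0, x), applied elementwise
    return np.maximum(0, x)

def relu_grad(x):
    # Derivative: 0 for x < 0, 1 for x > 0
    return (x > 0).astype(float)

x = np.array([-2.0, -0.5, 0.5, 2.0])
print(relu(x))       # [0.  0.  0.5 2. ]
print(relu_grad(x))  # [0. 0. 1. 1.]
```

Any unit whose input stays negative gets a zero gradient on every step, so its weights never move – that’s a dead neuron.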

There have been some proposed alternatives to this, such as the leaky ReLU, the ELU, and the SELU.

Interestingly, none of these have seemed to catch on and it’s still ReLU by default.

So how does the Swish activation function work?

The function itself is very simple:

$$ f(x) = x \sigma(x) $$

Where \( \sigma(x) \) is the usual sigmoid activation function.

$$ \sigma(x) = (1 + e^{-x})^{-1} $$
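Before looking at the plots, here is a minimal NumPy sketch of Swish (my own code, not from the paper):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def swish(x):
    # Swish: f(x) = x * sigmoid(x)
    return x * sigmoid(x)

print(swish(0.0))    # 0.0, since anything times x = 0 is 0
print(swish(10.0))   # ~10: for large positive x, sigmoid(x) -> 1, so swish(x) -> x
print(swish(-10.0))  # ~0: for large negative x, sigmoid(x) -> 0, so swish(x) -> 0
```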

It looks like this:

What’s interesting about this is that, unlike most common activation functions, it is not monotonically increasing. Does that matter? It seems the answer is no!

The derivative looks like this:

One interesting thing we can do is re-parameterize the Swish, in order to “stretch out” the sigmoid:

$$ f(x) = 2x \sigma(\beta x) $$

We can see that if \( \beta = 0 \), we get the identity activation \( f(x) = x \); and if \( \beta \rightarrow \infty \), the sigmoid converges to the unit step, and multiplying that by \( 2x \) gives us \( f(x) = 2\max(0, x) \), which is just the ReLU multiplied by a constant factor.

So including \( \beta \) is a way for us to nonlinearly interpolate between identity and ReLU.
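We can check both limits numerically with the re-parameterized form above (again, my own sketch):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def swish_beta(x, beta):
    # Re-parameterized Swish from above: f(x) = 2x * sigmoid(beta * x)
    return 2.0 * x * sigmoid(beta * x)

x = 3.0
print(swish_beta(x, 0.0))     # beta = 0: 2x * 0.5 = x, so exactly 3.0
print(swish_beta(x, 100.0))   # large beta: ~2 * max(0, x), so ~6.0
print(swish_beta(-x, 100.0))  # large beta: ~2 * max(0, -x), so ~0.0
```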

The title of the paper is “Swish: a Self-Gated Activation Function”, which might make you wonder, “Why is it self-gated?”

This should remind you of the LSTM, where we have “gates” in the form of sigmoids that control how much of a vector gets passed on to the next stage, by multiplying it by the output of the sigmoid, which is a number between 0 and 1.

So “self-gated” means that the gate is just the sigmoid of the activation itself.

Gate: \( \sigma(x) \)

Value to pass through: \( x \)

But that’s enough theory. For most of us, we want to know: “Does it work?”

And more practically, “Can I just use this by default instead of the ReLU?”

The best thing to do is just to try it for yourself and see how robust it is to different settings of hyperparameters (learning rate, architecture, etc.), but let’s look at some results from the paper so we can be confident when it comes to using Swish.


To compare Swish with baseline, a statistical test called the one-sided paired sign test was used.

This is a short post to help those of you who need help translating code from Python 2 to Python 3.

Python 2 is the most popular Python version (at least at this time, and certainly at the time my courses were created), hence why it was used.

It also comes pre-installed on Mac OS and Ubuntu, so when you type “python” into your command line, you get Python 2.

This list is not exhaustive. It shows only code that appears commonly in my machine learning scripts, to assist the students taking my machine learning courses (https://deeplearningcourses.com).

Integer Division

OLD:

a / b

NEW:

a // b

For Loops

OLD:

for i in xrange(n):

NEW:

for i in range(n):

Printing

OLD:

print "hello world"

NEW:

print("hello world")

Dictionary iteration

OLD:

for k, v in d.iteritems():

NEW:

for k, v in d.items():

COMPATIBLE WITH BOTH:

from future.utils import iteritems
for k, v in iteritems(d):
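Putting those translations together, here is a quick Python 3 sanity check you can run (variable names are mine):

```python
# Integer division: in Python 3, "/" is always true division; "//" floors.
a, b = 7, 2
print(a // b)  # 3
print(a / b)   # 3.5

# xrange is gone; range is already lazy in Python 3.
for i in range(3):
    print(i)   # 0, 1, 2

# iteritems is gone; items() returns a lightweight view.
d = {"x": 1, "y": 2}
for k, v in d.items():
    print(k, v)
```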

Since I am still busy hacking away at my next course, we are going to do another HUGE sale. ALL courses on Udemy are now $12. Take this opportunity to grab as many courses as you can because you never know when the next sale is going to be!

As usual, I’m providing $12 coupons for all my courses in the links below. Please use these links and share them with your friends!

You can also just type in the coupon code “SEP123”.

The promo goes until September 20. Don’t wait!

At the end of this post, I’m going to provide you with some additional links to get machine learning prerequisites (calculus, linear algebra, Python, etc…) for $12 too!

But that’s not all… I’m the Lazy Programmer, not just the Lazy Data Scientist – I’ve got $12 coupons for iOS development, Android development, Ruby on Rails, Python, Big Data / Hadoop / Spark, React.js, Angular, and MORE. All important skillsets on ANY engineering team. Got any friends or coworkers in mobile / backend / big data development? Let them know!


Remember, these links will self-destruct on September 20 (7 days). Act NOW!

BIG DISCOUNTS for everyone! If you’re in the USA you should see $10 coupons. If you’re in another country you’ll see the corresponding amount in your own currency.

But before we get to that, I want to mention that the VIP bonus for my latest Deep Learning course on GANs and Variational Autoencoders is CLOSING TODAY.

So if you want to get the VIP bonus and you haven’t gotten it yet, NOW is the time!

Just a reminder of what you get:

1) PDF cheatsheet / tutorial on Variational Autoencoders for your reading convenience

2) PDF cheatsheet / tutorial on GANs for your reading convenience (with exercises)

3) Pre-trained style transfer network! No need to train for 4 months on your slow CPU, or pay hundreds of dollars to use a GPU, or download 100s of MBs of Tensorflow checkpoint data! I’ve condensed the neural network weights to a few MBs so you can get going right away.

If you don’t know what “style transfer” is – that’s where I train a neural network to learn the “style” of Picasso or Da Vinci, and then apply it to a completely unrelated image like the Chicago skyline.

Very cool application of neural networks!

Remember: these VIP bonuses are ONLY available if you use the VIP coupon, which is automatically applied when you click this link:


Remember, these links will self-destruct on August 31 (10 days). Act NOW!

Since I am still busy hacking away at my next course, we are going to do another HUGE sale. ALL courses on Udemy are now $10. Take this opportunity to grab as many courses as you can because you never know when the next sale is going to be!

As usual, I’m providing $10 coupons for all my courses in the links below. Please use these links and share them with your friends!

You can also just type in the coupon code “JUL456”.

The $10 promo doesn’t come around often, so make sure you pick up everything you are interested in, or could become interested in later this year. The promo goes until July 31. Don’t wait!

At the end of this post, I’m going to provide you with some additional links to get machine learning prerequisites (calculus, linear algebra, Python, etc…) for $10 too!