Gradient descent, how neural networks learn | Deep learning, chapter 2


Home page: https://www.3blue1brown.com/
Brought to you by you: http://3b1b.co/nn2-thanks
And by Amplify Partners.

For any early stage ML startup founders, Amplify Partners would love to hear from you via 3blue1brown@amplifypartners.com

To learn more, I highly recommend the book by Michael Nielsen
http://neuralnetworksanddeeplearning.com/
The book walks through the code behind the example in these videos, which you can find here:
https://github.com/mnielsen/neural-networks-and-deep-learning

MNIST database:
http://yann.lecun.com/exdb/mnist/

Also check out Chris Olah’s blog:
http://colah.github.io/
His post on neural networks and topology is particularly beautiful, but honestly all of the stuff there is great.

And if you like that, you’ll *love* the publications at distill:
https://distill.pub/

For more videos, Welch Labs also has some great series on machine learning:
https://youtu.be/i8D90DkCLhI
https://youtu.be/bxe2T-V8XRs

“But I’ve already voraciously consumed Nielsen’s, Olah’s and Welch’s works”, I hear you say. Well well, look at you then. That being the case, I might recommend that you continue on with the book “Deep Learning” by Goodfellow, Bengio, and Courville.

Thanks to Lisha Li (@lishali88) for her contributions at the end, and for letting me pick her brain so much about the material. Here are the articles she referenced at the end:
https://arxiv.org/abs/1611.03530
https://arxiv.org/abs/1706.05394
https://arxiv.org/abs/1412.0233

Music by Vincent Rubinetti:
https://vincerubinetti.bandcamp.com/album/the-music-of-3blue1brown

——————

3blue1brown is a channel about animating math, in all senses of the word animate. And you know the drill with YouTube, if you want to stay posted on new videos, subscribe, and click the bell to receive notifications (if you’re into that).

If you are new to this channel and want to see more, a good place to start is this playlist: http://3b1b.co/recommended

Various social media stuffs:
Website: https://www.3blue1brown.com
Twitter: https://twitter.com/3Blue1Brown
Patreon: https://patreon.com/3blue1brown
Facebook: https://www.facebook.com/3blue1brown
Reddit: https://www.reddit.com/r/3Blue1Brown

Comments

3Blue1Brown says:

Part 3 will be on backpropagation. I had originally planned to include it here, but the more I wanted to dig into a proper walk-through for what it's really doing, the more deserving it became of its own video. Stay tuned!

Belle Cheytac says:

Now you know for sure that machines cannot feel emotions; they can only calculate.

OT A says:

I like all your videos before I watch them

mondlos says:

8:40 haha, and that's why schools MUST stop using multiple choice, because even AI can do multiple choice

mondlos says:

A neural network is a function. Using a bunch of linear algebra, we can find a local minimum of the cost function (which measures the network's average error over the training data for a given setting of weights and biases), and thus "train" it to recognize patterns that are similar to what it has seen, but not exactly it. The outputs are probabilities, so to speak, for each possible answer, and the network logically chooses the output with the highest activation.
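For anyone who wants to see that loop concretely, here is a minimal gradient-descent sketch in NumPy; the quadratic cost is a toy stand-in for a real network's cost function, and all names are illustrative rather than from the video's code:

import numpy as np

def cost(w):
    # toy stand-in: a real network's cost averages its errors over the training data
    return np.sum((w - 3.0) ** 2)

def grad(w):
    # gradient of the toy cost; a real network gets this from backpropagation
    return 2.0 * (w - 3.0)

w = np.random.randn(5)  # stand-in for the network's weights and biases
eta = 0.1               # learning rate: the size of each downhill step
for step in range(100):
    w -= eta * grad(w)  # nudge the weights along the negative gradient

print(cost(w))  # the cost is now near a minimum (here, the global one)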

Yoshi says:

What are …. 0:43

THOSE

Don Branson says:

This is great! Thank you for putting it together!

Hangil Kim says:

"But we can do better! Growth mindset!" at 5:18 …. a wholesome intellectual i love to see it

DaredD3vil says:

This is gold.

Sandeep Singhal says:

This video series makes a lot of sense. Most videos get pretentious as they dive right into jargon or third-party libraries 🙂

Wayne Gu says:

My intelligence and math cannot overcome the AI.

Rednassie says:

I started with neural networks, wrote my own handwritten-digit network (using Keras in Python), and I'm very proud of that. Time to learn some more!
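For the curious, here is a minimal sketch of that kind of Keras digit classifier, loosely mirroring the video's 784-16-16-10 architecture; the hyperparameters are illustrative guesses, not Rednassie's actual code:

import tensorflow as tf

# load MNIST and scale pixel values to [0, 1]
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0

model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),    # 784 input pixels
    tf.keras.layers.Dense(16, activation="sigmoid"),  # two hidden layers of 16,
    tf.keras.layers.Dense(16, activation="sigmoid"),  # as in the video's network
    tf.keras.layers.Dense(10, activation="softmax"),  # one output per digit
])
model.compile(optimizer="sgd",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=5)
model.evaluate(x_test, y_test)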

Keyser Söze says:

Can you please add more videos to this series, covering CNNs, LSTMs, GRUs, etc.?

Toni Kaiser says:

Yes, but can't you just set the first derivative to zero, solve it, and the results then tell you where the minimum is, right?
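That only works when the equation ∇C = 0 can be solved in closed form. For a simple parabola it can, but for a network whose cost depends nonlinearly on thousands of weights and biases it generally can't, which is why gradient descent settles for stepping downhill iteratively:

$$\nabla C(W) = 0 \quad \text{(usually no closed-form solution for a deep network)}$$
$$W \leftarrow W - \eta \, \nabla C(W) \quad \text{(so we repeatedly step downhill instead)}$$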

EngX Hub says:

Vectorizing a state of a result and then visualizing it to improve is a game changer.

Sharpnova says:

Michael Nielsen's book is unreadable. He presents it as a narrow, NARROW column on the far left side of a web page.

I have no idea why he chose to do this. It's quite hideous to read because of that. Otherwise I'd be reading it and donating right now.

Ty Thacker says:

Biological neurons actually are not binary; they produce ranges of voltage, as well as pulse patterns similar to pulse-width modulation. Love the video style, btw.

Jyothish Ravindran says:

Thanks a lot for this….

John Thane says:

In the 1980s I worked on neural networks. I used genetic algorithms to train them: I created populations of neural networks, mutated their genes (synapses), had them mate and create offspring, and after a while I would rank them for accuracy and "kill the stupid ones".
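For anyone curious what that loop looks like in code, here is a hedged sketch of that style of neuroevolution in NumPy; the fitness function is a hypothetical stand-in for decoding a genome into network weights and measuring accuracy on training data:

import numpy as np

rng = np.random.default_rng(0)
N_GENES, POP, GENERATIONS = 20, 50, 100

def fitness(genes):
    # hypothetical stand-in: real use would decode the genes into a
    # network's synapse weights and score its classification accuracy
    return -np.sum(genes ** 2)

population = rng.standard_normal((POP, N_GENES))
for gen in range(GENERATIONS):
    # rank the population from fittest to least fit
    order = np.argsort([-fitness(g) for g in population])
    survivors = population[order][: POP // 2]           # "kill the stupid ones"
    # mate: each child mixes the genes of two random survivors
    moms = survivors[rng.integers(0, POP // 2, POP // 2)]
    dads = survivors[rng.integers(0, POP // 2, POP // 2)]
    mask = rng.random((POP // 2, N_GENES)) < 0.5
    children = np.where(mask, moms, dads)
    children += 0.1 * rng.standard_normal(children.shape)  # mutate
    population = np.vstack([survivors, children])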

Yunfei Chen says:

How bad should the computer feel??? But computers don't have feelings…

Kyssifrot says:

What if we added an eleventh option like "not a number" and fed in some additional training data with random pixels? Shouldn't that improve its capacity to recognize numbers and only numbers?
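For concreteness, a sketch of that idea in NumPy; the array names and sizes here are placeholder assumptions, and whether random-pixel negatives actually help a network reject non-digits is an empirical question rather than a sure thing:

import numpy as np

rng = np.random.default_rng(0)
# stand-ins for the real MNIST arrays: flattened 28x28 images in [0, 1]
x_train = rng.random((100, 784))
y_train = rng.integers(0, 10, 100)

n_junk = 50
junk_images = rng.random((n_junk, 784))  # pure random-pixel "non-digits"
junk_labels = np.full(n_junk, 10)        # new eleventh class: "not a number"

x_train = np.vstack([x_train, junk_images])
y_train = np.concatenate([y_train, junk_labels])
# the network's output layer then needs 11 neurons, one meaning "reject"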

Mattijs says:

This dude… He can teach rocket science to monkeys… I'm sure.
