THE FUTURE IS HERE

Neural Networks Pt. 2: Backpropagation Main Ideas

Backpropagation is the method we use to optimize parameters in a Neural Network. The ideas behind backpropagation are quite simple, but there are tons of details. This StatQuest focuses on explaining the main ideas in a way that is easy to understand.

NOTE: This StatQuest assumes that you already know the main ideas behind…
Neural Networks: https://youtu.be/CqOfi41LfDw
The Chain Rule: https://youtu.be/wl1myxrtQHQ
Gradient Descent: https://youtu.be/sDv4f4s2SB8

LAST NOTE: When I was researching this ‘Quest, I found this page by Sebastian Raschka to be helpful: https://sebastianraschka.com/faq/docs/backprop-arbitrary.html

For a complete index of all the StatQuest videos, check out:
https://statquest.org/video-index/

If you’d like to support StatQuest, please consider…

Buying my book, The StatQuest Illustrated Guide to Machine Learning:
PDF – https://statquest.gumroad.com/l/wvtmc
Paperback – https://www.amazon.com/dp/B09ZCKR4H6
Kindle eBook – https://www.amazon.com/dp/B09ZG79HXC

Patreon: https://www.patreon.com/statquest
…or…
YouTube Membership: https://www.youtube.com/channel/UCtYLUTtgS3k1Fg4y5tAhLbw/join

…a cool StatQuest t-shirt or sweatshirt:
https://shop.spreadshirt.com/statquest-with-josh-starmer/

…buying one or two of my songs (or go large and get a whole album!)
https://joshuastarmer.bandcamp.com/

…or just donating to StatQuest!
https://www.paypal.me/statquest

Lastly, if you want to keep up with me as I research and create new StatQuests, follow me on Twitter:
https://twitter.com/joshuastarmer

0:00 Awesome song and introduction
3:55 Fitting the Neural Network to the data
6:04 The Sum of the Squared Residuals
7:23 Testing different values for a parameter
8:38 Using the Chain Rule to calculate a derivative
13:28 Using Gradient Descent
16:05 Summary
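The steps outlined in the timestamps above — computing the Sum of the Squared Residuals, using the Chain Rule to get a derivative, and then applying Gradient Descent — can be sketched in a few lines of code. This is just a toy illustration of those ideas, not code from the video: the data and the single bias parameter b are made up for demonstration.

```python
# Toy sketch of the main backpropagation ideas: optimize one bias parameter, b,
# by minimizing the Sum of the Squared Residuals (SSR) with Gradient Descent,
# using the Chain Rule to compute d(SSR)/db. Data values are invented.

observed = [0.0, 1.0, 0.0]        # toy observed values
partial_preds = [0.2, 1.1, 0.3]   # toy network output before adding the bias b

def ssr(b):
    # Sum of the Squared Residuals: sum of (observed - predicted)^2
    return sum((obs - (p + b)) ** 2 for obs, p in zip(observed, partial_preds))

def d_ssr_db(b):
    # Chain Rule: d(SSR)/db = sum of 2 * (observed - predicted) * d(residual)/db
    #                       = sum of -2 * (observed - (p + b))
    return sum(-2 * (obs - (p + b)) for obs, p in zip(observed, partial_preds))

b = 0.0              # initial guess for the bias
learning_rate = 0.1
for _ in range(100):
    # Gradient Descent: take a step in the direction opposite the derivative
    step = learning_rate * d_ssr_db(b)
    b = b - step
    if abs(step) < 1e-6:  # stop when the step size is tiny
        break
```

After the loop, b settles near the value that minimizes the SSR, which is the same procedure a neural network uses for every weight and bias, just with longer Chain Rule expressions.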

#StatQuest #NeuralNetworks #Backpropagation