CS231n Winter 2016: Lecture 4: Backpropagation, Neural Networks 1

Stanford Winter Quarter 2016 class: CS231n: Convolutional Neural Networks for Visual Recognition. Lecture 4.

Get in touch on Twitter @cs231n, or on Reddit /r/cs231n.

Comments

vocdex says:

This was a very satisfying piece of information. The follow-up questions especially helped me clear up my doubts about many details. Thank you so much.

saurabh says:

Thanks for the great lecture.

Clyde Xu says:

35:10: I thought dz/dx = y, i.e., for each unit change in x, z changes by y. Why does the equation say dx = self.y * dz?
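
The dz in the lecture's code is the upstream gradient dL/dz flowing back from the layer above, not the local derivative dz/dx. By the chain rule, dL/dx = (dz/dx) * (dL/dz) = y * dz, which is exactly what dx = self.y * dz computes. Below is a minimal sketch of such a gate in the lecture's forward/backward style (around 32:37); the class name and the scalar usage example are illustrative, not the lecture's exact code:

```python
class MultiplyGate:
    """Gate computing z = x * y, in the lecture-style forward/backward API."""

    def forward(self, x, y):
        # Cache the inputs: they are needed to compute local gradients later.
        self.x = x
        self.y = y
        return x * y

    def backward(self, dz):
        # dz is the upstream gradient dL/dz, NOT the local derivative dz/dx.
        # Chain rule: dL/dx = (dz/dx) * (dL/dz) = y * dz, and symmetrically for y.
        dx = self.y * dz
        dy = self.x * dz
        return [dx, dy]


gate = MultiplyGate()
z = gate.forward(3.0, -4.0)   # z = -12.0
dx, dy = gate.backward(2.0)   # dx = -4.0 * 2.0 = -8.0, dy = 3.0 * 2.0 = 6.0
```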

Vatsal Desai says:

Even when such an excellent piece of knowledge is shared by Andrej, there are still some idiots who have disliked this great piece.

Cinderella Man says:

Thank you Andrej.

C T says:

Finally, a lecture that lifts the curtain of mystery on backpropagation! Excellent delivery of the core concepts of neural networks. This is the best lecture I have watched on the topic so far (including the most popular ones that are heavily promoted online!)

SAtIsh Kumar says:

Great explanation, it helped me understand backpropagation better. Thanks a lot.

Sayma Shammi says:

Where can I find the lecture slides? The lectures are too good!

aneta kufova says:

Best explanation of backprop and neural networks in general I've ever seen. Also good questions from the students (I like how interactive they are).

Lei Xun says:

My takeaways:
1. Computational graph and backpropagation examples 2:55
2. Implementation: forward/backward API 32:37
3. Vectorized implementation 43:26 (see the sketch after this list)
4. Summary so far 50:50
5. Neural Networks 52:35
6. Summary 1:16:05
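
On item 3, a minimal numpy sketch of a vectorized gate in the same forward/backward style, assuming z = W.dot(x); the key point from that part of the lecture is that each gradient has the same shape as the input it corresponds to. The class name and the shapes in the usage example are illustrative:

```python
import numpy as np


class MatVecMultiplyGate:
    """Vectorized gate computing z = W.dot(x), same forward/backward API."""

    def forward(self, W, x):
        # Cache the inputs for the backward pass.
        self.W = W
        self.x = x
        return W.dot(x)

    def backward(self, dz):
        # dz has the shape of z; each gradient matches the shape of its input.
        dW = np.outer(dz, self.x)   # dL/dW = dz x^T, shape of W
        dx = self.W.T.dot(dz)       # dL/dx = W^T dz, shape of x
        return [dW, dx]


np.random.seed(0)
gate = MatVecMultiplyGate()
z = gate.forward(np.random.randn(5, 10), np.random.randn(10))  # z: (5,)
dW, dx = gate.backward(np.random.randn(5))                     # (5, 10), (10,)
```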

Arnab Mukherjee says:

Excellent explanation of backpropagation through computational graphs
