MIT Introduction to Deep Learning | 6.S191
MIT Introduction to Deep Learning 6.S191: Lecture 1
*New 2024 Edition*
Foundations of Deep Learning
Lecturer: Alexander Amini
For all lectures, slides, and lab materials: http://introtodeeplearning.com/
Lecture Outline
0:00 - Introduction
7:25 - Course information
13:37 - Why deep learning?
17:20 - The perceptron
24:30 - Perceptron example
31:16 - From perceptrons to neural networks
37:51 - Applying neural networks
41:12 - Loss functions
44:22 - Training and gradient descent
49:52 - Backpropagation
54:57 - Setting the learning rate
58:54 - Batched gradient descent
1:02:28 - Regularization: dropout and early stopping
1:08:47 - Summary
Subscribe to stay up to date with new deep learning lectures at MIT, or follow @MITDeepLearning on Twitter and Instagram to stay fully-connected!!