
# Summary

📗 Examples and quizzes: E2
📗 Math homework: M3
📗 Programming homework: part 2 of P1
📗 No lecture.
📗 Tuesday programming office hours: 5:00 to 6:00, Java Guest Link, Python Guest Link (same for the first two weeks)
📗 Wednesday math homework office hours: 5:00 to 6:00, Guest Link
📗 Thursday math homework office hours: 5:00 to 6:00 (not held during the first two weeks)
📗 Friday office hours for other things: 5:00 to 6:00, Guest Link

# Lectures

📗 Slides
Lecture 3: Slides.
Lecture 4: Slides.

📗 Videos
Lecture 3 Part 1 (Neural Network): Link
Lecture 3 Part 2 (Backpropagation): Link
Lecture 3 Part 3 (Multi-Layer Network): Link
Lecture 4 Part 1 (Stochastic Gradient): Link (a minimal SGD sketch follows this list)
Lecture 4 Part 2 (Multi-Class Classification): Link
Lecture 4 Part 3 (Regularization): Link
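
As a companion to the Lecture 4 Part 1 video on stochastic gradient descent, here is a minimal sketch of SGD for logistic regression on toy data. The toy dataset (the AND function), learning rate, and epoch count are illustrative assumptions and are not taken from the lecture.

```python
import numpy as np

# Toy data: the AND function on two binary inputs (illustrative assumption).
X = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [1.0, 1.0]])
y = np.array([0, 0, 0, 1])

rng = np.random.default_rng(0)
w = rng.normal(size=2)   # weights
b = 0.0                  # bias
eta = 0.5                # learning rate (assumed)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Stochastic gradient descent: each update uses the gradient of the loss on a
# single randomly chosen example, instead of summing over the whole training set.
for epoch in range(100):
    for i in rng.permutation(len(X)):
        a = sigmoid(X[i] @ w + b)   # predicted probability for example i
        grad = a - y[i]             # dL/dz for the cross-entropy loss
        w -= eta * grad * X[i]      # update weights
        b -= eta * grad             # update bias

# Predicted probabilities move toward [0, 0, 0, 1] as training progresses.
print(np.round(sigmoid(X @ w + b), 2))
```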

📗 Notes
I recorded a video about P1 and how it is graded: Link. In case you are curious, it also explains how the auto-grading scripts (JavaScript) grade P1 and M1. You do not have to watch it to solve P1.
The video series by 3Blue1Brown on Neural Networks is really good: Playlist

# Other Materials

📗 Relevant websites
(from week 1) Gradient Descent: Link
Neural Network: Link
Neural Network MNIST Visualization: Link
Neural Network MNIST Demo: Link
Neural Network Videos by Grant Sanderson: Playlist (Thanks to Dan Drake for the recommendation)
Stochastic Gradient Descent: Link
Overfitting: Link
Neural Network Snake: Link
Neural Network Car: Link
Neural Network Flappy Bird: Link
Neural Network Mario: Link

📗 YouTube videos from 2019
How to construct an XOR network? Link (a minimal sketch of one such construction follows this list)
How to derive the 2-layer neural network gradient descent step? Link
How to derive the multi-layer neural network gradient descent induction step? Link
Comparison between L1 and L2 regularization. Link
Example (Quiz): Cross-validation accuracy: Link
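
The first video above asks how to construct a network that computes XOR. Below is a minimal sketch of one such construction using hand-set weights and a hard threshold activation; the particular weights and biases are one of many valid choices, assumed here for illustration.

```python
import numpy as np

def step(z):
    # Hard threshold (perceptron-style) activation.
    return (z >= 0).astype(float)

# Hidden layer: unit 1 computes OR(x1, x2), unit 2 computes AND(x1, x2).
W1 = np.array([[1.0, 1.0],    # OR unit weights
               [1.0, 1.0]])   # AND unit weights
b1 = np.array([-0.5, -1.5])   # OR fires when the sum >= 0.5; AND when it is >= 1.5

# Output layer: XOR = OR AND (NOT AND), i.e. fires when OR = 1 and AND = 0.
w2 = np.array([1.0, -1.0])
b2 = -0.5

def xor_net(x):
    h = step(W1 @ x + b1)     # hidden activations [OR, AND]
    return step(w2 @ h + b2)  # output activation = XOR(x1, x2)

for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x, int(xor_net(np.array(x, dtype=float))))
# Prints 0, 1, 1, 0
```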

Last Updated: November 09, 2021 at 12:30 AM