This material is part of the premium subscription. Get a premium subscription to watch or listen to Build a Simple Neural Network & Learn Backpropagation, as well as all other courses, right now!
Premium
  1. Lesson 1. 00:03:00
    Introduction
  2. Lesson 2. 00:06:49
    Introduction to Our Simple Neural Network
  3. Lesson 3. 00:06:20
    Why We Use Computational Graphs
  4. Lesson 4. 00:06:56
    Conducting the Forward Pass
  5. Lesson 5. 00:02:48
    Roadmap to Understanding Backpropagation
  6. Lesson 6. 00:04:28
    Derivatives Theory
  7. Lesson 7. 00:13:40
    Numerical Example of Derivatives
  8. Lesson 8. 00:08:02
    Partial Derivatives
  9. Lesson 9. 00:03:53
    Gradients
  10. Lesson 10. 00:10:14
    Understanding What Partial Derivatives Do
  11. Lesson 11. 00:05:01
    Introduction to Backpropagation
  12. Lesson 12. 00:07:33
    (Optional) Chain Rule
  13. Lesson 13. 00:07:37
    Gradient Derivation of Mean Squared Error Loss Function
  14. Lesson 14. 00:11:39
    Visualizing the Loss Function and Understanding Gradients
  15. Lesson 15. 00:18:43
    Using the Chain Rule to See how w2 Affects the Final Loss
  16. Lesson 16. 00:04:30
    Backpropagation of w1
  17. Lesson 17. 00:10:08
    Introduction to Gradient Descent Visually
  18. Lesson 18. 00:06:08
    Gradient Descent
  19. Lesson 19. 00:08:11
    Understanding the Learning Rate (Alpha)
  20. Lesson 20. 00:05:31
    Moving in the Opposite Direction of the Gradient
  21. Lesson 21. 00:08:48
    Calculating Gradient Descent by Hand
  22. Lesson 22. 00:04:24
    Coding our Simple Neural Network Part 1
  23. Lesson 23. 00:07:17
    Coding our Simple Neural Network Part 2
  24. Lesson 24. 00:06:32
    Coding our Simple Neural Network Part 3
  25. Lesson 25. 00:05:01
    Coding our Simple Neural Network Part 4
  26. Lesson 26. 00:05:23
    Coding our Simple Neural Network Part 5
  27. Lesson 27. 00:05:30
    Introduction to Our Complex Neural Network
  28. Lesson 28. 00:04:25
    Conducting the Forward Pass
  29. Lesson 29. 00:04:52
    Getting Started with Backpropagation
  30. Lesson 30. 00:07:43
    Getting the Derivative of the Sigmoid Activation Function (Optional)
  31. Lesson 31. 00:04:55
    Implementing Backpropagation with the Chain Rule
  32. Lesson 32. 00:06:10
    Understanding How w3 Affects the Final Loss
  33. Lesson 33. 00:07:43
    Calculating Gradients for Z1
  34. Lesson 34. 00:04:53
    Understanding How w1 and w2 Affect the Loss
  35. Lesson 35. 00:08:29
    Implementing Gradient Descent by Hand
  36. Lesson 36. 00:06:51
    Coding our Advanced Neural Network Part 1 (Implementing Forward Pass + Loss)
  37. Lesson 37. 00:10:11
    Coding our Advanced Neural Network Part 2 (Implement Backpropagation)
  38. Lesson 38. 00:05:35
    Coding our Advanced Neural Network Part 3 (Implement Gradient Descent)
  39. Lesson 39. 00:08:16
    Coding our Advanced Neural Network Part 4 (Training our Neural Network)