This material is part of the paid subscription. Get a premium subscription to watch or listen to Build a Simple Neural Network & Learn Backpropagation, as well as all other courses, right now!
  • Lesson 1. 00:03:00
    Introduction
  • Lesson 2. 00:06:49
    Introduction to Our Simple Neural Network
  • Lesson 3. 00:06:20
    Why We Use Computational Graphs
  • Lesson 4. 00:06:56
    Conducting the Forward Pass
  • Lesson 5. 00:02:48
    Roadmap to Understanding Backpropagation
  • Lesson 6. 00:04:28
    Derivatives Theory
  • Lesson 7. 00:13:40
    Numerical Example of Derivatives
  • Lesson 8. 00:08:02
    Partial Derivatives
  • Lesson 9. 00:03:53
    Gradients
  • Lesson 10. 00:10:14
    Understanding What Partial Derivatives Do
  • Lesson 11. 00:05:01
    Introduction to Backpropagation
  • Lesson 12. 00:07:33
    (Optional) Chain Rule
  • Lesson 13. 00:07:37
    Gradient Derivation of Mean Squared Error Loss Function
  • Lesson 14. 00:11:39
    Visualizing the Loss Function and Understanding Gradients
  • Lesson 15. 00:18:43
    Using the Chain Rule to See How w2 Affects the Final Loss
  • Lesson 16. 00:04:30
    Backpropagation of w1
  • Lesson 17. 00:10:08
    Introduction to Gradient Descent Visually
  • Lesson 18. 00:06:08
    Gradient Descent
  • Lesson 19. 00:08:11
    Understanding the Learning Rate (Alpha)
  • Lesson 20. 00:05:31
    Moving in the Opposite Direction of the Gradient
  • Lesson 21. 00:08:48
    Calculating Gradient Descent by Hand
  • Lesson 22. 00:04:24
    Coding our Simple Neural Network Part 1
  • Lesson 23. 00:07:17
    Coding our Simple Neural Network Part 2
  • Lesson 24. 00:06:32
    Coding our Simple Neural Network Part 3
  • Lesson 25. 00:05:01
    Coding our Simple Neural Network Part 4
  • Lesson 26. 00:05:23
    Coding our Simple Neural Network Part 5
  • Lesson 27. 00:05:30
    Introduction to Our Complex Neural Network
  • Lesson 28. 00:04:25
    Conducting the Forward Pass
  • Lesson 29. 00:04:52
    Getting Started with Backpropagation
  • Lesson 30. 00:07:43
    Getting the Derivative of the Sigmoid Activation Function (Optional)
  • Lesson 31. 00:04:55
    Implementing Backpropagation with the Chain Rule
  • Lesson 32. 00:06:10
    Understanding How w3 Affects the Final Loss
  • Lesson 33. 00:07:43
    Calculating Gradients for Z1
  • Lesson 34. 00:04:53
    Understanding How w1 and w2 Affect the Loss
  • Lesson 35. 00:08:29
    Implementing Gradient Descent by Hand
  • Lesson 36. 00:06:51
    Coding our Advanced Neural Network Part 1 (Implementing Forward Pass + Loss)
  • Lesson 37. 00:10:11
    Coding our Advanced Neural Network Part 2 (Implementing Backpropagation)
  • Lesson 38. 00:05:35
    Coding our Advanced Neural Network Part 3 (Implementing Gradient Descent)
  • Lesson 39. 00:08:16
    Coding our Advanced Neural Network Part 4 (Training our Neural Network)
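To give a flavor of what the first half of the course builds (forward pass, MSE loss, chain-rule backpropagation for w1 and w2, and a gradient-descent step), here is a minimal sketch. The architecture is an assumption made for illustration — a chain of two weights with prediction x * w1 * w2 — since the lessons only name w1, w2, MSE, and gradient descent; the course's actual network may differ.

```python
# Hypothetical tiny network for illustration: prediction = (x * w1) * w2,
# trained with mean squared error. Not necessarily the exact network
# used in the course lessons.

def forward(x, w1, w2):
    h = x * w1           # first node of the computational graph
    y_pred = h * w2      # second node: the prediction
    return h, y_pred

def mse(y_pred, y_true):
    return (y_pred - y_true) ** 2

def gradients(x, w1, w2, y_true):
    # Backpropagation via the chain rule:
    # dL/dw2 = dL/dy_pred * dy_pred/dw2, and so on back to w1.
    h, y_pred = forward(x, w1, w2)
    dL_dy = 2 * (y_pred - y_true)   # derivative of MSE w.r.t. prediction
    dL_dw2 = dL_dy * h              # dy_pred/dw2 = h
    dL_dh = dL_dy * w2              # flow back through the second node
    dL_dw1 = dL_dh * x              # dh/dw1 = x
    return dL_dw1, dL_dw2

def step(x, w1, w2, y_true, alpha=0.1):
    # Gradient descent: move each weight opposite its gradient,
    # scaled by the learning rate alpha.
    g1, g2 = gradients(x, w1, w2, y_true)
    return w1 - alpha * g1, w2 - alpha * g2

w1, w2 = 0.5, 0.5
x, y_true = 1.0, 1.0
for _ in range(200):
    w1, w2 = step(x, w1, w2, y_true)
_, y_pred = forward(x, w1, w2)
print(y_pred)  # prediction approaches the target of 1.0
```

Repeated steps shrink the loss because each update moves the weights in the direction that decreases the MSE fastest.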
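The second half of the course introduces a sigmoid activation and its derivative (Lesson 30). As a quick sketch of that identity — sigma'(z) = sigma(z) * (1 - sigma(z)) — the snippet below checks the analytic derivative against a central-difference numerical estimate, the same style of numerical check used when studying derivatives earlier in the course:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def sigmoid_grad(z):
    # Analytic derivative: sigma'(z) = sigma(z) * (1 - sigma(z))
    s = sigmoid(z)
    return s * (1.0 - s)

z = 0.3
h = 1e-5
# Central-difference numerical derivative for comparison
numeric = (sigmoid(z + h) - sigmoid(z - h)) / (2 * h)
analytic = sigmoid_grad(z)
print(abs(numeric - analytic) < 1e-8)  # the two estimates agree
```

This kind of numerical check is a handy way to verify any hand-derived gradient before using it in backpropagation.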