This material is part of the premium subscription. Sign up for premium to watch or listen to Advanced AI: LLMs Explained with Math (Transformers, Attention Mechanisms & More), along with all other courses, right now!
  • Lesson 1. 00:03:01
    Advanced AI: LLMs Explained with Math
  • Lesson 2. 00:03:22
    Creating Our Optional Experiment Notebook - Part 1
  • Lesson 3. 00:04:02
    Creating Our Optional Experiment Notebook - Part 2
  • Lesson 4. 00:13:25
    Encoding Categorical Labels to Numeric Values
  • Lesson 5. 00:15:06
    Understanding the Tokenization Vocabulary
  • Lesson 6. 00:10:57
    Encoding Tokens
  • Lesson 7. 00:12:49
    Practical Example of Tokenization and Encoding
  • Lesson 8. 00:04:47
    DistilBert vs. Bert Differences
  • Lesson 9. 00:07:41
    Embeddings In A Continuous Vector Space
  • Lesson 10. 00:05:14
    Introduction To Positional Encodings
  • Lesson 11. 00:04:15
    Positional Encodings - Part 1
  • Lesson 12. 00:10:11
    Positional Encodings - Part 2 (Even and Odd Indices)
  • Lesson 13. 00:05:09
    Why Use Sine and Cosine Functions
  • Lesson 14. 00:09:53
    Understanding the Nature of Sine and Cosine Functions
  • Lesson 15. 00:09:25
    Visualizing Positional Encodings in Sine and Cosine Graphs
  • Lesson 16. 00:18:08
    Solving the Equations to Get the Values for Positional Encodings
  • Lesson 17. 00:03:03
    Introduction to Attention Mechanism
  • Lesson 18. 00:18:11
    Query, Key and Value Matrix
  • Lesson 19. 00:06:54
    Getting Started with Our Step by Step Attention Calculation
  • Lesson 20. 00:20:06
    Calculating Key Vectors
  • Lesson 21. 00:10:21
    Query Matrix Introduction
  • Lesson 22. 00:21:25
    Calculating Raw Attention Scores
  • Lesson 23. 00:13:33
    Understanding the Mathematics Behind Dot Products and Vector Alignment
  • Lesson 24. 00:05:43
    Visualizing Raw Attention Scores in 2D
  • Lesson 25. 00:09:17
    Converting Raw Attention Scores to Probability Distributions with Softmax
  • Lesson 26. 00:03:20
    Normalization
  • Lesson 27. 00:09:08
    Understanding the Value Matrix and Value Vector
  • Lesson 28. 00:10:46
    Calculating the Final Context Aware Rich Representation for the Word "River"
  • Lesson 29. 00:01:59
    Understanding the Output
  • Lesson 30. 00:11:56
    Understanding Multi Head Attention
  • Lesson 31. 00:09:52
    Multi Head Attention Example and Subsequent Layers
  • Lesson 32. 00:02:30
    Masked Language Learning
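The positional-encoding and attention topics listed above (sine/cosine encodings, raw attention scores, softmax, and the value matrix) follow the standard transformer formulas. As a rough orientation, here is a minimal NumPy sketch of those formulas; this is illustrative code written for this summary, not material from the course, and the function names are my own:

```python
import numpy as np

def positional_encoding(seq_len: int, d_model: int) -> np.ndarray:
    """Sinusoidal positional encodings: sine on even indices, cosine on odd."""
    positions = np.arange(seq_len)[:, None]              # shape (seq_len, 1)
    dims = np.arange(d_model)[None, :]                   # shape (1, d_model)
    # Each pair of dimensions shares one frequency: 1 / 10000^(2i / d_model).
    angle_rates = 1.0 / np.power(10000.0, (2 * (dims // 2)) / d_model)
    angles = positions * angle_rates
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles[:, 0::2])                # even indices: sin
    pe[:, 1::2] = np.cos(angles[:, 1::2])                # odd indices: cos
    return pe

def scaled_dot_product_attention(Q: np.ndarray, K: np.ndarray,
                                 V: np.ndarray) -> np.ndarray:
    """softmax(Q K^T / sqrt(d_k)) V: raw scores, softmax, weighted values."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                      # raw attention scores
    # Numerically stable softmax over each row of scores.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)       # rows sum to 1
    return weights @ V                                   # context-aware output
```

Multi-head attention (Lessons 30-31) runs several such attention computations in parallel on learned projections of Q, K, and V and concatenates the results.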