AI Engineering Bootcamp: Build, Train & Deploy Models with AWS SageMaker

Lessons:
  1. AI Engineering Bootcamp: Learn AWS SageMaker with Patrik Szepesi (00:01:36)
  2. Course Introduction (00:08:43)
  3. Setting Up Our AWS Account (00:04:32)
  4. Set Up IAM Roles + Best Practices (00:07:40)
  5. AWS Security Best Practices (00:07:02)
  6. Set Up AWS SageMaker Domain (00:02:23)
  7. UI Domain Change (00:00:43)
  8. Setting Up SageMaker Environment (00:05:09)
  9. SageMaker Studio and Pricing (00:08:45)
  10. Setup: SageMaker Server + PyTorch (00:06:09)
  11. HuggingFace Models, Sentiment Analysis, and AutoScaling (00:18:35)
  12. Get Dataset for Multiclass Text Classification (00:06:04)
  13. Creating Our AWS S3 Bucket (00:03:53)
  14. Uploading Our Training Data to S3 (00:01:27)
  15. Exploratory Data Analysis - Part 1 (00:13:22)
  16. Exploratory Data Analysis - Part 2 (00:06:08)
  17. Data Visualization and Best Practices (00:11:09)
  18. Setting Up Our Training Job Notebook + Reasons to Use SageMaker (00:18:25)
  19. Python Script for HuggingFace Estimator (00:13:37)
  20. Creating Our Optional Experiment Notebook - Part 1 (00:03:22)
  21. Creating Our Optional Experiment Notebook - Part 2 (00:04:02)
  22. Encoding Categorical Labels to Numeric Values (00:13:25)
  23. Understanding the Tokenization Vocabulary (00:15:06)
  24. Encoding Tokens (00:10:57)
  25. Practical Example of Tokenization and Encoding (00:12:49)
  26. Creating Our Dataset Loader Class (00:16:57)
  27. Setting Pytorch DataLoader (00:15:10)
  28. Which Path Will You Take? (00:01:32)
  29. DistilBert vs. Bert Differences (00:04:47)
  30. Embeddings In A Continuous Vector Space (00:07:41)
  31. Introduction To Positional Encodings (00:05:14)
  32. Positional Encodings - Part 1 (00:04:15)
  33. Positional Encodings - Part 2 (Even and Odd Indices) (00:10:11)
  34. Why Use Sine and Cosine Functions (00:05:09)
  35. Understanding the Nature of Sine and Cosine Functions (00:09:53)
  36. Visualizing Positional Encodings in Sine and Cosine Graphs (00:09:25)
  37. Solving the Equations to Get the Values for Positional Encodings (00:18:08)
  38. Introduction to Attention Mechanism (00:03:03)
  39. Query, Key and Value Matrix (00:18:11)
  40. Getting Started with Our Step by Step Attention Calculation (00:06:54)
  41. Calculating Key Vectors (00:20:06)
  42. Query Matrix Introduction (00:10:21)
  43. Calculating Raw Attention Scores (00:21:25)
  44. Understanding the Mathematics Behind Dot Products and Vector Alignment (00:13:33)
  45. Visualizing Raw Attention Scores in 2D (00:05:43)
  46. Converting Raw Attention Scores to Probability Distributions with Softmax (00:09:17)
  47. Normalization (00:03:20)
  48. Understanding the Value Matrix and Value Vector (00:09:08)
  49. Calculating the Final Context Aware Rich Representation for the Word "River" (00:10:46)
  50. Understanding the Output (00:01:59)
  51. Understanding Multi Head Attention (00:11:56)
  52. Multi Head Attention Example and Subsequent Layers (00:09:52)
  53. Masked Language Learning (00:02:30)
  54. Exercise: Imposter Syndrome (00:02:57)
  55. Creating Our Custom Model Architecture with PyTorch (00:17:15)
  56. Adding the Dropout, Linear Layer, and ReLU to Our Model (00:15:32)
  57. Creating Our Accuracy Function (00:13:05)
  58. Creating Our Train Function (00:19:09)
  59. Finishing Our Train Function (00:08:18)
  60. Setting Up the Validation Function (00:13:41)
  61. Passing Parameters In SageMaker (00:04:06)
  62. Setting Up Model Parameters For Training (00:04:28)
  63. Understanding The Mathematics Behind Cross Entropy Loss (00:05:40)
  64. Finishing Our Script.py File (00:06:57)
  65. Quota Increase (00:07:36)
  66. Starting Our Training Job (00:08:16)
  67. Debugging Our Training Job With AWS CloudWatch (00:14:17)
  68. Analyzing Our Training Job Results (00:05:47)
  69. Creating Our Inference Script For Our PyTorch Model (00:08:35)
  70. Finishing Our PyTorch Inference Script (00:09:13)
  71. Setting Up Our Deployment (00:07:31)
  72. Deploying Our Model To A SageMaker Endpoint (00:08:55)
  73. Introduction to Endpoint Load Testing (00:04:20)
  74. Creating Our Test Data for Load Testing (00:10:03)
  75. Upload Testing Data to S3 (00:01:04)
  76. Creating Our Model for Load Testing (00:03:59)
  77. Starting Our Load Test Job (00:07:15)
  78. Analyze Load Test Results (00:10:17)
  79. Deploying Our Endpoint (00:03:51)
  80. Creating Lambda Function to Call Our Endpoint (00:10:27)
  81. Setting Up Our AWS API Gateway (00:05:28)
  82. Testing Our Model with Postman, API Gateway and Lambda (00:05:40)
  83. Cleaning Up Resources (00:02:52)
  84. Thank You! (00:01:18)