AI Engineering Bootcamp: Build, Train & Deploy Models with AWS SageMaker
  • Lesson 1. AI Engineering Bootcamp: Learn AWS SageMaker with Patrik Szepesi (00:01:36)
  • Lesson 2. Course Introduction (00:08:43)
  • Lesson 3. Setting Up Our AWS Account (00:04:32)
  • Lesson 4. Set Up IAM Roles + Best Practices (00:07:40)
  • Lesson 5. AWS Security Best Practices (00:07:02)
  • Lesson 6. Set Up AWS SageMaker Domain (00:02:23)
  • Lesson 7. UI Domain Change (00:00:43)
  • Lesson 8. Setting Up SageMaker Environment (00:05:09)
  • Lesson 9. SageMaker Studio and Pricing (00:08:45)
  • Lesson 10. Setup: SageMaker Server + PyTorch (00:06:09)
  • Lesson 11. HuggingFace Models, Sentiment Analysis, and AutoScaling (00:18:35)
  • Lesson 12. Get Dataset for Multiclass Text Classification (00:06:04)
  • Lesson 13. Creating Our AWS S3 Bucket (00:03:53)
  • Lesson 14. Uploading Our Training Data to S3 (00:01:27)
  • Lesson 15. Exploratory Data Analysis - Part 1 (00:13:22)
  • Lesson 16. Exploratory Data Analysis - Part 2 (00:06:08)
  • Lesson 17. Data Visualization and Best Practices (00:11:09)
  • Lesson 18. Setting Up Our Training Job Notebook + Reasons to Use SageMaker (00:18:25)
  • Lesson 19. Python Script for HuggingFace Estimator (00:13:37)
  • Lesson 20. Creating Our Optional Experiment Notebook - Part 1 (00:03:22)
  • Lesson 21. Creating Our Optional Experiment Notebook - Part 2 (00:04:02)
  • Lesson 22. Encoding Categorical Labels to Numeric Values (00:13:25)
  • Lesson 23. Understanding the Tokenization Vocabulary (00:15:06)
  • Lesson 24. Encoding Tokens (00:10:57)
  • Lesson 25. Practical Example of Tokenization and Encoding (00:12:49)
  • Lesson 26. Creating Our Dataset Loader Class (00:16:57)
  • Lesson 27. Setting Up the PyTorch DataLoader (00:15:10)
  • Lesson 28. Which Path Will You Take? (00:01:32)
  • Lesson 29. DistilBert vs. Bert Differences (00:04:47)
  • Lesson 30. Embeddings In A Continuous Vector Space (00:07:41)
  • Lesson 31. Introduction To Positional Encodings (00:05:14)
  • Lesson 32. Positional Encodings - Part 1 (00:04:15)
  • Lesson 33. Positional Encodings - Part 2 (Even and Odd Indices) (00:10:11)
  • Lesson 34. Why Use Sine and Cosine Functions (00:05:09)
  • Lesson 35. Understanding the Nature of Sine and Cosine Functions (00:09:53)
  • Lesson 36. Visualizing Positional Encodings in Sine and Cosine Graphs (00:09:25)
  • Lesson 37. Solving the Equations to Get the Values for Positional Encodings (00:18:08)
  • Lesson 38. Introduction to Attention Mechanism (00:03:03)
  • Lesson 39. Query, Key and Value Matrix (00:18:11)
  • Lesson 40. Getting Started with Our Step by Step Attention Calculation (00:06:54)
  • Lesson 41. Calculating Key Vectors (00:20:06)
  • Lesson 42. Query Matrix Introduction (00:10:21)
  • Lesson 43. Calculating Raw Attention Scores (00:21:25)
  • Lesson 44. Understanding the Mathematics Behind Dot Products and Vector Alignment (00:13:33)
  • Lesson 45. Visualizing Raw Attention Scores in 2D (00:05:43)
  • Lesson 46. Converting Raw Attention Scores to Probability Distributions with Softmax (00:09:17)
  • Lesson 47. Normalization (00:03:20)
  • Lesson 48. Understanding the Value Matrix and Value Vector (00:09:08)
  • Lesson 49. Calculating the Final Context Aware Rich Representation for the Word "River" (00:10:46)
  • Lesson 50. Understanding the Output (00:01:59)
  • Lesson 51. Understanding Multi Head Attention (00:11:56)
  • Lesson 52. Multi Head Attention Example and Subsequent Layers (00:09:52)
  • Lesson 53. Masked Language Learning (00:02:30)
  • Lesson 54. Exercise: Imposter Syndrome (00:02:57)
  • Lesson 55. Creating Our Custom Model Architecture with PyTorch (00:17:15)
  • Lesson 56. Adding the Dropout, Linear Layer, and ReLU to Our Model (00:15:32)
  • Lesson 57. Creating Our Accuracy Function (00:13:05)
  • Lesson 58. Creating Our Train Function (00:19:09)
  • Lesson 59. Finishing Our Train Function (00:08:18)
  • Lesson 60. Setting Up the Validation Function (00:13:41)
  • Lesson 61. Passing Parameters In SageMaker (00:04:06)
  • Lesson 62. Setting Up Model Parameters For Training (00:04:28)
  • Lesson 63. Understanding The Mathematics Behind Cross Entropy Loss (00:05:40)
  • Lesson 64. Finishing Our Script.py File (00:06:57)
  • Lesson 65. Quota Increase (00:07:36)
  • Lesson 66. Starting Our Training Job (00:08:16)
  • Lesson 67. Debugging Our Training Job With AWS CloudWatch (00:14:17)
  • Lesson 68. Analyzing Our Training Job Results (00:05:47)
  • Lesson 69. Creating Our Inference Script For Our PyTorch Model (00:08:35)
  • Lesson 70. Finishing Our PyTorch Inference Script (00:09:13)
  • Lesson 71. Setting Up Our Deployment (00:07:31)
  • Lesson 72. Deploying Our Model To A SageMaker Endpoint (00:08:55)
  • Lesson 73. Introduction to Endpoint Load Testing (00:04:20)
  • Lesson 74. Creating Our Test Data for Load Testing (00:10:03)
  • Lesson 75. Upload Testing Data to S3 (00:01:04)
  • Lesson 76. Creating Our Model for Load Testing (00:03:59)
  • Lesson 77. Starting Our Load Test Job (00:07:15)
  • Lesson 78. Analyze Load Test Results (00:10:17)
  • Lesson 79. Deploying Our Endpoint (00:03:51)
  • Lesson 80. Creating Lambda Function to Call Our Endpoint (00:10:27)
  • Lesson 81. Setting Up Our AWS API Gateway (00:05:28)
  • Lesson 82. Testing Our Model with Postman, API Gateway and Lambda (00:05:40)
  • Lesson 83. Cleaning Up Resources (00:02:52)
  • Lesson 84. Thank You! (00:01:18)