Lesson 1. 00:06:53
Introduction (Hugging Face Ecosystem and Text Classification)
Lesson 2. 00:04:41
More Text Classification Examples
Lesson 3. 00:07:22
What We're Going To Build!
Lesson 4. 00:05:53
Getting Setup: Adding Hugging Face Tokens to Google Colab
Lesson 5. 00:09:36
Getting Setup: Importing Necessary Libraries to Google Colab
Lesson 6. 00:16:01
Downloading a Text Classification Dataset from Hugging Face Datasets
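
A minimal sketch of what this lesson covers, using the `datasets` library; the dataset name ("imdb") is a stand-in, not necessarily the one used in the course:

```python
from datasets import load_dataset

# "imdb" is an illustrative text classification dataset from the Hugging Face Hub;
# swap in whichever dataset the course (or your project) uses.
dataset = load_dataset("imdb")
print(dataset)               # DatasetDict with its splits
print(dataset["train"][0])   # a single example: text plus label
```
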
Lesson 7. 00:12:49
Preparing Text Data for Use with a Model - Part 1: Turning Our Labels into Numbers
Lesson 8. 00:06:19
Preparing Text Data for Use with a Model - Part 2: Creating Train and Test Sets
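
A hedged sketch covering Parts 1 and 2, using a tiny in-memory dataset with made-up column and label names so it runs on its own:

```python
from datasets import Dataset

# Toy stand-in for the downloaded dataset; column and label names are illustrative.
raw = Dataset.from_dict({
    "text": ["Great pizza!", "Terrible service.", "Loved it.", "Never again."],
    "label_text": ["positive", "negative", "positive", "negative"],
})

# Part 1: turn string labels into integer ids (keep the reverse mapping for later).
label2id = {"negative": 0, "positive": 1}
id2label = {v: k for k, v in label2id.items()}
raw = raw.map(lambda example: {"label": label2id[example["label_text"]]})

# Part 2: create train and test splits.
splits = raw.train_test_split(test_size=0.25, seed=42)
train_ds, test_ds = splits["train"], splits["test"]
```
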
Lesson 9. 00:12:54
Preparing Text Data for Use with a Model - Part 3: Getting a Tokenizer
Lesson 10. 00:10:27
Preparing Text Data for Use with a Model - Part 4: Exploring Our Tokenizer
Lesson 11. 00:17:58
Preparing Text Data for Use with a Model - Part 5: Creating a Function to Tokenize Our Data
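
A sketch of Parts 3-5; the checkpoint name is an assumption (any Hub checkpoint with a tokenizer works):

```python
from transformers import AutoTokenizer

# Illustrative checkpoint; the course may use a different one.
tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")

# Part 4: explore what the tokenizer does to a single sample.
encoded = tokenizer("I love machine learning!")
print(encoded["input_ids"])
print(tokenizer.convert_ids_to_tokens(encoded["input_ids"]))

# Part 5: a function to tokenize batches of examples, for use with Dataset.map(batched=True).
def tokenize_function(examples):
    return tokenizer(examples["text"], truncation=True)

# tokenized_train = train_ds.map(tokenize_function, batched=True)
```
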
Lesson 12. 00:08:54
Setting Up an Evaluation Metric (to measure how well our model performs)
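
A sketch of an accuracy metric wired up the way Hugging Face's Trainer expects (via a compute_metrics function); assumes the `evaluate` library:

```python
import numpy as np
import evaluate

accuracy = evaluate.load("accuracy")

def compute_metrics(eval_pred):
    # Trainer passes (logits, labels); take the argmax to get predicted class ids.
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    return accuracy.compute(predictions=predictions, references=labels)
```
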
Lesson 13. 00:07:11
Introduction to Transfer Learning (a powerful technique to get good results quickly)
Lesson 14. 00:12:20
Model Training - Part 1: Setting Up a Pretrained Model from the Hugging Face Hub
Lesson 15. 00:12:27
Model Training - Part 2: Counting the Parameters in Our Model
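
A sketch of Parts 1 and 2: loading a pretrained checkpoint for sequence classification (checkpoint and label names are placeholders) and counting its parameters:

```python
from transformers import AutoModelForSequenceClassification

model = AutoModelForSequenceClassification.from_pretrained(
    "distilbert-base-uncased",                      # illustrative checkpoint
    num_labels=2,                                   # match your dataset's classes
    id2label={0: "negative", 1: "positive"},
    label2id={"negative": 0, "positive": 1},
)

# Part 2: count total and trainable parameters.
total = sum(p.numel() for p in model.parameters())
trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
print(f"Total parameters: {total:,} | Trainable: {trainable:,}")
```
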
Lesson 16. 00:03:54
Model Training - Part 3: Creating a Folder to Save Our Model
Lesson 17. 00:15:00
Model Training - Part 4: Setting Up Our Training Arguments with TrainingArguments
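
A minimal TrainingArguments sketch; only a few common arguments are shown and the values are illustrative (argument names can vary slightly between transformers versions):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="models/text_classifier",   # the folder created in Part 3
    num_train_epochs=3,
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    save_strategy="epoch",
)
```
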
Lesson 18. 00:05:06
Model Training - Part 5: Setting Up an Instance of Trainer with Hugging Face Transformers
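
A Trainer sketch that reuses names from the earlier sketches (model, training_args, tokenized train/test splits, tokenizer, compute_metrics), so it is not standalone:

```python
from transformers import Trainer

trainer = Trainer(
    model=model,
    args=training_args,
    train_dataset=tokenized_train,   # tokenized splits from the earlier sketches
    eval_dataset=tokenized_test,
    tokenizer=tokenizer,             # newer transformers versions prefer processing_class=
    compute_metrics=compute_metrics,
)

# Part 6: kick off training.
trainer.train()
```
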
Lesson 19. 00:13:35
Model Training - Part 6: Training Our Model and Fixing Errors Along the Way
Lesson 20. 00:14:40
Model Training - Part 7: Inspecting Our Model's Loss Curves
Lesson 21. 00:08:02
Model Training - Part 8: Uploading Our Model to the Hugging Face Hub
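
A sketch of pushing a trained model and tokenizer to the Hub; the repo id is a placeholder, and a write-access token is required (for example via `huggingface-cli login`):

```python
# With a Trainer, push_to_hub() uses the hub settings from TrainingArguments.
trainer.push_to_hub()

# Or push the model and tokenizer directly to a named repo (placeholder id).
model.push_to_hub("your-username/your-model-name")
tokenizer.push_to_hub("your-username/your-model-name")
```
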
Lesson 22. 00:05:59
Making Predictions on the Test Data with Our Trained Model
Lesson 23. 00:12:49
Turning Our Predictions into Prediction Probabilities with PyTorch
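
A self-contained sketch of turning logits into probabilities with PyTorch; the logits tensor stands in for the output of trainer.predict():

```python
import torch

# Stand-in for trainer.predict(test_ds).predictions (raw logits, one row per sample).
logits = torch.tensor([[1.2, -0.3],
                       [0.1,  2.4]])

probs = torch.softmax(logits, dim=-1)   # prediction probabilities per class
pred_ids = probs.argmax(dim=-1)         # most likely class per sample
print(probs)
print(pred_ids)
```
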
Lesson 24. 00:05:11
Sorting Our Model's Predictions by Their Probability
Lesson 25. 00:09:41
Performing Inference - Part 1: Discussing Our Options
Lesson 26. 00:10:02
Performing Inference - Part 2: Using a Transformers Pipeline (one sample at a time)
Lesson 27. 00:06:39
Performing Inference - Part 3: Using a Transformers Pipeline on Multiple Samples at a Time (Batching)
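
A sketch of Parts 2 and 3 using a transformers pipeline; the checkpoint shown is a public sentiment model used purely as a stand-in for the course's fine-tuned model:

```python
from transformers import pipeline

classifier = pipeline("text-classification",
                      model="distilbert-base-uncased-finetuned-sst-2-english")

# Part 2: one sample at a time.
print(classifier("I really enjoyed this!"))

# Part 3: multiple samples in one call, processed in batches.
texts = ["I really enjoyed this!", "That was awful.", "Not sure how I feel."]
print(classifier(texts, batch_size=8))
```
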
Lesson 28. 00:10:34
Performing Inference - Part 4: Running Speed Tests to Compare One at a Time vs. Batched Predictions
Lesson 29. 00:12:07
Performing Inference - Part 5: Performing Inference with PyTorch
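
A sketch of Part 5: running inference directly with PyTorch instead of a pipeline (same placeholder checkpoint as above):

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

checkpoint = "distilbert-base-uncased-finetuned-sst-2-english"  # placeholder
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(checkpoint)
model.eval()

inputs = tokenizer(["I really enjoyed this!"], return_tensors="pt",
                   truncation=True, padding=True)
with torch.inference_mode():
    logits = model(**inputs).logits

probs = torch.softmax(logits, dim=-1)
pred_label = model.config.id2label[int(probs.argmax(dim=-1)[0])]
print(pred_label, float(probs.max()))
```
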
Lesson 30. 00:34:29
OPTIONAL - Putting It All Together: From Data Loading, to Model Training, to Making Predictions on Custom Data
Lesson 31. 00:03:48
Turning Our Model into a Demo - Part 1: Gradio Overview
Lesson 32. 00:07:08
Turning Our Model into a Demo - Part 2: Building a Function to Map Inputs to Outputs
Lesson 33. 00:06:47
Turning Our Model into a Demo - Part 3: Getting Our Gradio Demo Running Locally
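
A sketch of Parts 2 and 3: a function mapping text input to a {label: probability} dict, wrapped in a Gradio Interface and launched locally (the checkpoint is again a placeholder):

```python
import gradio as gr
from transformers import pipeline

classifier = pipeline("text-classification",
                      model="distilbert-base-uncased-finetuned-sst-2-english")

def classify_text(text: str) -> dict:
    # Gradio's Label component expects a {label: confidence} mapping.
    results = classifier(text, top_k=None)   # scores for every class
    return {result["label"]: result["score"] for result in results}

demo = gr.Interface(fn=classify_text,
                    inputs="text",
                    outputs=gr.Label(num_top_classes=2),
                    title="Text Classification Demo")
demo.launch()   # runs the demo locally
```
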
Lesson 34. 00:08:02
Making Our Demo Publicly Accessible - Part 1: Introduction to Hugging Face Spaces and Creating a Demos Directory
Lesson 35. 00:12:15
Making Our Demo Publicly Accessible - Part 2: Creating an App File
Lesson 36. 00:07:08
Making Our Demo Publicly Accessible - Part 3: Creating a README File
Lesson 37. 00:03:34
Making Our Demo Publicly Accessible - Part 4: Making a Requirements File
Lesson 38. 00:18:44
Making Our Demo Publicly Accessible - Part 5: Uploading Our Demo to Hugging Face Spaces and Making it Publicly Available
Lesson 39. 00:05:56
Summary Exercises and Extensions