This material is part of the paid subscription. Get a premium subscription to watch or listen to TensorFlow Developer Certificate in 2023: Zero to Mastery, as well as all the other courses, right now!
  • Lesson 1. Course Outline (00:05:22)
  • Lesson 2. What is deep learning? (00:04:39)
  • Lesson 3. Why use deep learning? (00:09:39)
  • Lesson 4. What are neural networks? (00:10:27)
  • Lesson 5. What is deep learning already being used for? (00:08:37)
  • Lesson 6. What is and why use TensorFlow? (00:07:57)
  • Lesson 7. What is a Tensor? (00:03:38)
  • Lesson 8. What we're going to cover throughout the course (00:04:30)
  • Lesson 9. How to approach this course (00:05:34)
  • Lesson 10. Creating your first tensors with TensorFlow and tf.constant() (00:18:46; see sketch 1 after this list)
  • Lesson 11. Creating tensors with TensorFlow and tf.Variable() (00:07:08)
  • Lesson 12. Creating random tensors with TensorFlow (00:09:41)
  • Lesson 13. Shuffling the order of tensors (00:09:41)
  • Lesson 14. Creating tensors from NumPy arrays (00:11:56)
  • Lesson 15. Getting information from your tensors (tensor attributes) (00:11:58)
  • Lesson 16. Indexing and expanding tensors (00:12:34)
  • Lesson 17. Manipulating tensors with basic operations (00:05:35)
  • Lesson 18. Matrix multiplication with tensors part 1 (00:11:54)
  • Lesson 19. Matrix multiplication with tensors part 2 (00:13:30)
  • Lesson 20. Matrix multiplication with tensors part 3 (00:10:04)
  • Lesson 21. Changing the datatype of tensors (00:06:56)
  • Lesson 22. Tensor aggregation (finding the min, max, mean & more) (00:09:50)
  • Lesson 23. Tensor troubleshooting example (updating tensor datatypes) (00:06:14)
  • Lesson 24. Finding the positional minimum and maximum of a tensor (argmin and argmax) (00:09:32)
  • Lesson 25. Squeezing a tensor (removing all 1-dimension axes) (00:03:00)
  • Lesson 26. One-hot encoding tensors (00:05:47)
  • Lesson 27. Trying out more tensor math operations (00:04:48)
  • Lesson 28. Exploring TensorFlow and NumPy's compatibility (00:05:44)
  • Lesson 29. Making sure our tensor operations run really fast on GPUs (00:10:20)
  • Lesson 30. Introduction to Neural Network Regression with TensorFlow (00:07:34)
  • Lesson 31. Inputs and outputs of a neural network regression model (00:09:00)
  • Lesson 32. Anatomy and architecture of a neural network regression model (00:07:56)
  • Lesson 33. Creating sample regression data (so we can model it) (00:12:47)
  • Lesson 34. The major steps in modelling with TensorFlow (00:20:16; see sketch 2 after this list)
  • Lesson 35. Steps in improving a model with TensorFlow part 1 (00:06:03)
  • Lesson 36. Steps in improving a model with TensorFlow part 2 (00:09:26)
  • Lesson 37. Steps in improving a model with TensorFlow part 3 (00:12:34)
  • Lesson 38. Evaluating a TensorFlow model part 1 ("visualise, visualise, visualise") (00:07:25)
  • Lesson 39. Evaluating a TensorFlow model part 2 (the three datasets) (00:11:02)
  • Lesson 40. Evaluating a TensorFlow model part 3 (getting a model summary) (00:17:19)
  • Lesson 41. Evaluating a TensorFlow model part 4 (visualising a model's layers) (00:07:15)
  • Lesson 42. Evaluating a TensorFlow model part 5 (visualising a model's predictions) (00:09:17)
  • Lesson 43. Evaluating a TensorFlow model part 6 (common regression evaluation metrics) (00:08:06)
  • Lesson 44. Evaluating a TensorFlow regression model part 7 (mean absolute error) (00:05:53)
  • Lesson 45. Evaluating a TensorFlow regression model part 8 (mean square error) (00:03:19)
  • Lesson 46. Setting up TensorFlow modelling experiments part 1 (start with a simple model) (00:13:51)
  • Lesson 47. Setting up TensorFlow modelling experiments part 2 (increasing complexity) (00:11:30)
  • Lesson 48. Comparing and tracking your TensorFlow modelling experiments (00:10:21)
  • Lesson 49. How to save a TensorFlow model (00:08:20)
  • Lesson 50. How to load and use a saved TensorFlow model (00:10:16)
  • Lesson 51. (Optional) How to save and download files from Google Colab (00:06:19)
  • Lesson 52. Putting together what we've learned part 1 (preparing a dataset) (00:13:32)
  • Lesson 53. Putting together what we've learned part 2 (building a regression model) (00:13:21)
  • Lesson 54. Putting together what we've learned part 3 (improving our regression model) (00:15:48)
  • Lesson 55. Preprocessing data with feature scaling part 1 (what is feature scaling?) (00:09:35)
  • Lesson 56. Preprocessing data with feature scaling part 2 (normalising our data) (00:10:58)
  • Lesson 57. Preprocessing data with feature scaling part 3 (fitting a model on scaled data) (00:07:41)
  • Lesson 58. Introduction to neural network classification in TensorFlow (00:08:26)
  • Lesson 59. Example classification problems (and their inputs and outputs) (00:06:39)
  • Lesson 60. Input and output tensors of classification problems (00:06:22)
  • Lesson 61. Typical architecture of neural network classification models with TensorFlow (00:09:37)
  • Lesson 62. Creating and viewing classification data to model (00:11:35)
  • Lesson 63. Checking the input and output shapes of our classification data (00:04:39)
  • Lesson 64. Building a not very good classification model with TensorFlow (00:12:11)
  • Lesson 65. Trying to improve our not very good classification model (00:09:14)
  • Lesson 66. Creating a function to view our model's not so good predictions (00:15:09)
  • Lesson 67. Making our poor classification model work for a regression dataset (00:12:19)
  • Lesson 68. Non-linearity part 1: Straight lines and non-straight lines (00:09:39)
  • Lesson 69. Non-linearity part 2: Building our first neural network with non-linearity (00:05:48)
  • Lesson 70. Non-linearity part 3: Upgrading our non-linear model with more layers (00:10:19)
  • Lesson 71. Non-linearity part 4: Modelling our non-linear data once and for all (00:08:38)
  • Lesson 72. Non-linearity part 5: Replicating non-linear activation functions from scratch (00:14:27)
  • Lesson 73. Getting great results in less time by tweaking the learning rate (00:14:48)
  • Lesson 74. Using the TensorFlow History object to plot a model's loss curves (00:06:12)
  • Lesson 75. Using callbacks to find a model's ideal learning rate (00:17:33)
  • Lesson 76. Training and evaluating a model with an ideal learning rate (00:09:21)
  • Lesson 77. Introducing more classification evaluation methods (00:06:05)
  • Lesson 78. Finding the accuracy of our classification model (00:04:18)
  • Lesson 79. Creating our first confusion matrix (to see where our model is getting confused) (00:08:28)
  • Lesson 80. Making our confusion matrix prettier (00:14:01)
  • Lesson 81. Putting things together with multi-class classification part 1: Getting the data (00:10:38)
  • Lesson 82. Multi-class classification part 2: Becoming one with the data (00:07:08)
  • Lesson 83. Multi-class classification part 3: Building a multi-class classification model (00:15:39)
  • Lesson 84. Multi-class classification part 4: Improving performance with normalisation (00:12:44)
  • Lesson 85. Multi-class classification part 5: Comparing normalised and non-normalised data (00:04:14)
  • Lesson 86. Multi-class classification part 6: Finding the ideal learning rate (00:10:39)
  • Lesson 87. Multi-class classification part 7: Evaluating our model (00:13:17)
  • Lesson 88. Multi-class classification part 8: Creating a confusion matrix (00:04:27)
  • Lesson 89. Multi-class classification part 9: Visualising random model predictions (00:10:43)
  • Lesson 90. What "patterns" is our model learning? (00:15:34)
  • Lesson 91. Introduction to Computer Vision with TensorFlow (00:09:37)
  • Lesson 92. Introduction to Convolutional Neural Networks (CNNs) with TensorFlow (00:08:00)
  • Lesson 93. Downloading an image dataset for our first Food Vision model (00:08:28)
  • Lesson 94. Becoming One With Data (00:05:06)
  • Lesson 95. Becoming One With Data Part 2 (00:12:27)
  • Lesson 96. Becoming One With Data Part 3 (00:04:23)
  • Lesson 97. Building an end-to-end CNN Model (00:18:18)
  • Lesson 98. Using a GPU to run our CNN model 5x faster (00:09:18)
  • Lesson 99. Trying a non-CNN model on our image data (00:08:52)
  • Lesson 100. Improving our non-CNN model by adding more layers (00:09:53)
  • Lesson 101. Breaking our CNN model down part 1: Becoming one with the data (00:09:04)
  • Lesson 102. Breaking our CNN model down part 2: Preparing to load our data (00:11:47)
  • Lesson 103. Breaking our CNN model down part 3: Loading our data with ImageDataGenerator (00:09:55; see sketch 3 after this list)
  • Lesson 104. Breaking our CNN model down part 4: Building a baseline CNN model (00:08:03)
  • Lesson 105. Breaking our CNN model down part 5: Looking inside a Conv2D layer (00:15:21)
  • Lesson 106. Breaking our CNN model down part 6: Compiling and fitting our baseline CNN (00:07:15)
  • Lesson 107. Breaking our CNN model down part 7: Evaluating our CNN's training curves (00:11:46)
  • Lesson 108. Breaking our CNN model down part 8: Reducing overfitting with Max Pooling (00:13:41)
  • Lesson 109. Breaking our CNN model down part 9: Reducing overfitting with data augmentation (00:06:53)
  • Lesson 110. Breaking our CNN model down part 10: Visualizing our augmented data (00:15:05)
  • Lesson 111. Breaking our CNN model down part 11: Training a CNN model on augmented data (00:08:50)
  • Lesson 112. Breaking our CNN model down part 12: Discovering the power of shuffling data (00:10:02)
  • Lesson 113. Breaking our CNN model down part 13: Exploring options to improve our model (00:05:22)
  • Lesson 114. Downloading a custom image to make predictions on (00:04:55)
  • Lesson 115. Writing a helper function to load and preprocess custom images (00:10:01)
  • Lesson 116. Making a prediction on a custom image with our trained CNN (00:10:09)
  • Lesson 117. Multi-class CNNs part 1: Becoming one with the data (00:15:00)
  • Lesson 118. Multi-class CNNs part 2: Preparing our data (turning it into tensors) (00:06:39)
  • Lesson 119. Multi-class CNNs part 3: Building a multi-class CNN model (00:07:25)
  • Lesson 120. Multi-class CNNs part 4: Fitting a multi-class CNN model to the data (00:06:03)
  • Lesson 121. Multi-class CNNs part 5: Evaluating our multi-class CNN model (00:04:52)
  • Lesson 122. Multi-class CNNs part 6: Trying to fix overfitting by removing layers (00:12:20)
  • Lesson 123. Multi-class CNNs part 7: Trying to fix overfitting with data augmentation (00:11:47)
  • Lesson 124. Multi-class CNNs part 8: Things you could do to improve your CNN model (00:04:24)
  • Lesson 125. Multi-class CNNs part 9: Making predictions with our model on custom images (00:09:23)
  • Lesson 126. Saving and loading our trained CNN model (00:06:22)
  • Lesson 127. What is and why use transfer learning? (00:10:13)
  • Lesson 128. Downloading and preparing data for our first transfer learning model (00:14:40)
  • Lesson 129. Introducing Callbacks in TensorFlow and making a callback to track our models (00:10:02)
  • Lesson 130. Exploring the TensorFlow Hub website for pretrained models (00:09:52)
  • Lesson 131. Building and compiling a TensorFlow Hub feature extraction model (00:14:01)
  • Lesson 132. Blowing our previous models out of the water with transfer learning (00:09:14)
  • Lesson 133. Plotting the loss curves of our ResNet feature extraction model (00:07:36)
  • Lesson 134. Building and training a pre-trained EfficientNet model on our data (00:09:43)
  • Lesson 135. Different Types of Transfer Learning (00:11:41)
  • Lesson 136. Comparing Our Model's Results (00:15:17)
  • Lesson 137. Introduction to Transfer Learning in TensorFlow Part 2: Fine-tuning (00:06:17)
  • Lesson 138. Importing a script full of helper functions (and saving lots of space) (00:07:36)
  • Lesson 139. Downloading and turning our images into a TensorFlow BatchDataset (00:15:39)
  • Lesson 140. Discussing the four (actually five) modelling experiments we're running (00:02:16)
  • Lesson 141. Comparing the TensorFlow Keras Sequential API versus the Functional API (00:02:35)
  • Lesson 142. Creating our first model with the TensorFlow Keras Functional API (00:11:39)
  • Lesson 143. Compiling and fitting our first Functional API model (00:10:54)
  • Lesson 144. Getting a feature vector from our trained model (00:13:40)
  • Lesson 145. Drilling into the concept of a feature vector (a learned representation) (00:03:44)
  • Lesson 146. Downloading and preparing the data for Model 1 (1 percent of training data) (00:09:52)
  • Lesson 147. Building a data augmentation layer to use inside our model (00:12:07)
  • Lesson 148. Visualising what happens when images pass through our data augmentation layer (00:10:56)
  • Lesson 149. Building Model 1 (with a data augmentation layer and 1% of training data) (00:15:56)
  • Lesson 150. Building Model 2 (with a data augmentation layer and 10% of training data) (00:16:38)
  • Lesson 151. Creating a ModelCheckpoint to save our model's weights during training (00:07:26)
  • Lesson 152. Fitting and evaluating Model 2 (and saving its weights using ModelCheckpoint) (00:07:15)
  • Lesson 153. Loading and comparing saved weights to our existing trained Model 2 (00:07:18)
  • Lesson 154. Preparing Model 3 (our first fine-tuned model) (00:20:27)
  • Lesson 155. Fitting and evaluating Model 3 (our first fine-tuned model) (00:07:46)
  • Lesson 156. Comparing our model's results before and after fine-tuning (00:10:27)
  • Lesson 157. Downloading and preparing data for our biggest experiment yet (Model 4) (00:06:25)
  • Lesson 158. Preparing our final modelling experiment (Model 4) (00:12:01)
  • Lesson 159. Fine-tuning Model 4 on 100% of the training data and evaluating its results (00:10:20)
  • Lesson 160. Comparing our modelling experiment results in TensorBoard (00:10:47)
  • Lesson 161. How to view and delete previous TensorBoard experiments (00:02:05)
  • Lesson 162. Introduction to Transfer Learning Part 3: Scaling Up (00:06:20)
  • Lesson 163. Getting helper functions ready and downloading data to model (00:13:35)
  • Lesson 164. Outlining the model we're going to build and building a ModelCheckpoint callback (00:05:39)
  • Lesson 165. Creating a data augmentation layer to use with our model (00:04:40)
  • Lesson 166. Creating a headless EfficientNetB0 model with data augmentation built in (00:08:59; see sketch 4 after this list)
  • Lesson 167. Fitting and evaluating our biggest transfer learning model yet (00:07:57)
  • Lesson 168. Unfreezing some layers in our base model to prepare for fine-tuning (00:11:29)
  • Lesson 169. Fine-tuning our feature extraction model and evaluating its performance (00:08:24)
  • Lesson 170. Saving and loading our trained model (00:06:26)
  • Lesson 171. Downloading a pretrained model to make and evaluate predictions with (00:06:35)
  • Lesson 172. Making predictions with our trained model on 25,250 test samples (00:12:47)
  • Lesson 173. Unravelling our test dataset for comparing ground truth labels to predictions (00:06:06)
  • Lesson 174. Confirming our model's predictions are in the same order as the test labels (00:05:18)
  • Lesson 175. Creating a confusion matrix for our model's 101 different classes (00:12:08)
  • Lesson 176. Evaluating every individual class in our dataset (00:14:17)
  • Lesson 177. Plotting our model's F1-scores for each separate class (00:07:37)
  • Lesson 178. Creating a function to load and prepare images for making predictions (00:12:09)
  • Lesson 179. Making predictions on our test images and evaluating them (00:16:07)
  • Lesson 180. Discussing the benefits of finding your model's most wrong predictions (00:06:10)
  • Lesson 181. Writing code to uncover our model's most wrong predictions (00:11:17)
  • Lesson 182. Plotting and visualizing the samples our model got most wrong (00:10:37)
  • Lesson 183. Making predictions on and plotting our own custom images (00:09:50)
  • Lesson 184. Introduction to Milestone Project 1: Food Vision Big™ (00:05:45)
  • Lesson 185. Making sure we have access to the right GPU for mixed precision training (00:10:18)
  • Lesson 186. Getting helper functions ready (00:03:07)
  • Lesson 187. Introduction to TensorFlow Datasets (TFDS) (00:12:04)
  • Lesson 188. Exploring and becoming one with the data (Food101 from TensorFlow Datasets) (00:15:57)
  • Lesson 189. Creating a preprocessing function to prepare our data for modelling (00:15:51)
  • Lesson 190. Batching and preparing our datasets (to make them run fast) (00:13:48)
  • Lesson 191. Exploring what happens when we batch and prefetch our data (00:06:50)
  • Lesson 192. Creating modelling callbacks for our feature extraction model (00:07:15)
  • Lesson 193. Turning on mixed precision training with TensorFlow (00:10:06; see sketch 5 after this list)
  • Lesson 194. Creating a feature extraction model capable of using mixed precision training (00:12:43)
  • Lesson 195. Checking to see if our model is using mixed precision training layer by layer (00:07:57)
  • Lesson 196. Training and evaluating a feature extraction model (Food Vision Big™) (00:10:20)
  • Lesson 197. Introducing your Milestone Project 1 challenge: build a model to beat DeepFood (00:07:48)
  • Lesson 198. Introduction to Natural Language Processing (NLP) and Sequence Problems (00:12:52)
  • Lesson 199. Example NLP inputs and outputs (00:07:23)
  • Lesson 200. The typical architecture of a Recurrent Neural Network (RNN) (00:09:04)
  • Lesson 201. Preparing a notebook for our first NLP with TensorFlow project (00:08:53)
  • Lesson 202. Becoming one with the data and visualizing a text dataset (00:16:42)
  • Lesson 203. Splitting data into training and validation sets (00:06:27)
  • Lesson 204. Converting text data to numbers using tokenisation and embeddings (overview) (00:09:23)
  • Lesson 205. Setting up a TensorFlow TextVectorization layer to convert text to numbers (00:17:11; see sketch 6 after this list)
  • Lesson 206. Mapping the TextVectorization layer to text data and turning it into numbers (00:11:03)
  • Lesson 207. Creating an Embedding layer to turn tokenised text into embedding vectors (00:12:28)
  • Lesson 208. Discussing the various modelling experiments we're going to run (00:08:58)
  • Lesson 209. Model 0: Building a baseline model to try and improve upon (00:09:26)
  • Lesson 210. Creating a function to track and evaluate our model's results (00:12:15)
  • Lesson 211. Model 1: Building, fitting and evaluating our first deep model on text data (00:20:52)
  • Lesson 212. Visualizing our model's learned word embeddings with TensorFlow's projector tool (00:20:44)
  • Lesson 213. High-level overview of Recurrent Neural Networks (RNNs) + where to learn more (00:09:35)
  • Lesson 214. Model 2: Building, fitting and evaluating our first TensorFlow RNN model (LSTM) (00:18:17)
  • Lesson 215. Model 3: Building, fitting and evaluating a GRU-cell powered RNN (00:16:57)
  • Lesson 216. Model 4: Building, fitting and evaluating a bidirectional RNN model (00:19:35)
  • Lesson 217. Discussing the intuition behind Conv1D neural networks for text and sequences (00:19:32)
  • Lesson 218. Model 5: Building, fitting and evaluating a 1D CNN for text (00:09:58)
  • Lesson 219. Using TensorFlow Hub for pretrained word embeddings (transfer learning for NLP) (00:13:46)
  • Lesson 220. Model 6: Building, training and evaluating a transfer learning model for NLP (00:10:46)
  • Lesson 221. Preparing subsets of data for model 7 (same as model 6 but 10% of data) (00:10:53)
  • Lesson 222. Model 7: Building, training and evaluating a transfer learning model on 10% data (00:10:05)
  • Lesson 223. Fixing our data leakage issue with model 7 and retraining it (00:13:43)
  • Lesson 224. Comparing all our modelling experiments' evaluation metrics (00:13:15)
  • Lesson 225. Uploading our model's training logs to TensorBoard and comparing them (00:11:15)
  • Lesson 226. Saving and loading in a trained NLP model with TensorFlow (00:10:26)
  • Lesson 227. Downloading a pretrained model and preparing data to investigate predictions (00:13:25)
  • Lesson 228. Visualizing our model's most wrong predictions (00:08:29)
  • Lesson 229. Making and visualizing predictions on the test dataset (00:08:28)
  • Lesson 230. Understanding the concept of the speed/score tradeoff (00:15:02)
  • Lesson 231. Introduction to Milestone Project 2: SkimLit (00:14:21)
  • Lesson 232. What we're going to cover in Milestone Project 2 (NLP for medical abstracts) (00:07:23)
  • Lesson 233. SkimLit inputs and outputs (00:11:03)
  • Lesson 234. Setting up our notebook for Milestone Project 2 (getting the data) (00:14:59)
  • Lesson 235. Visualizing examples from the dataset (becoming one with the data) (00:13:19)
  • Lesson 236. Writing a preprocessing function to structure our data for modelling (00:19:51)
  • Lesson 237. Performing visual data analysis on our preprocessed text (00:07:56)
  • Lesson 238. Turning our target labels into numbers (ML models require numbers) (00:13:16)
  • Lesson 239. Model 0: Creating, fitting and evaluating a baseline model for SkimLit (00:09:26)
  • Lesson 240. Preparing our data for deep sequence models (00:09:56)
  • Lesson 241. Creating a text vectoriser to map our tokens (text) to numbers (00:14:08)
  • Lesson 242. Creating a custom token embedding layer with TensorFlow (00:09:15)
  • Lesson 243. Creating fast-loading datasets with the TensorFlow tf.data API (00:09:50)
  • Lesson 244. Model 1: Building, fitting and evaluating a Conv1D with token embeddings (00:17:22)
  • Lesson 245. Preparing a pretrained embedding layer from TensorFlow Hub for Model 2 (00:10:54)
  • Lesson 246. Model 2: Building, fitting and evaluating a Conv1D model with token embeddings (00:11:31)
  • Lesson 247. Creating a character-level tokeniser with TensorFlow's TextVectorization layer (00:23:25)
  • Lesson 248. Creating a character-level embedding layer with tf.keras.layers.Embedding (00:07:45)
  • Lesson 249. Model 3: Building, fitting and evaluating a Conv1D model on character embeddings (00:13:46)
  • Lesson 250. Discussing how we're going to build Model 4 (character + token embeddings) (00:06:05)
  • Lesson 251. Model 4: Building a multi-input model (hybrid token + character embeddings) (00:15:37)
  • Lesson 252. Model 4: Plotting and visually exploring different data inputs (00:07:33)
  • Lesson 253. Crafting multi-input fast-loading tf.data datasets for Model 4 (00:08:42)
  • Lesson 254. Model 4: Building, fitting and evaluating a hybrid embedding model (00:13:19)
  • Lesson 255. Model 5: Adding positional embeddings via feature engineering (overview) (00:07:19)
  • Lesson 256. Encoding the line number feature to be used with Model 5 (00:12:26)
  • Lesson 257. Encoding the total lines feature to be used with Model 5 (00:07:57)
  • Lesson 258. Model 5: Building the foundations of a tribrid embedding model (00:09:20)
  • Lesson 259. Model 5: Completing the build of a tribrid embedding model for sequences (00:14:09)
  • Lesson 260. Visually inspecting the architecture of our tribrid embedding model (00:10:26)
  • Lesson 261. Creating multi-level data input pipelines for Model 5 with the tf.data API (00:09:01)
  • Lesson 262. Bringing SkimLit to life!!! (fitting and evaluating Model 5) (00:10:36)
  • Lesson 263. Comparing the performance of all of our modelling experiments (00:09:37)
  • Lesson 264. Saving, loading & testing our best performing model (00:07:49)
  • Lesson 265. Congratulations and your challenge before heading to the next module (00:12:34)
  • Lesson 266. Introduction to Milestone Project 3 (BitPredict) & where you can get help (00:03:54)
  • Lesson 267. What is a time series problem and example forecasting problems at Uber (00:07:47)
  • Lesson 268. Example forecasting problems in daily life (00:04:53)
  • Lesson 269. What can be forecast? (00:07:58)
  • Lesson 270. What we're going to cover (broadly) (00:02:36)
  • Lesson 271. Time series forecasting inputs and outputs (00:08:56)
  • Lesson 272. Downloading and inspecting our Bitcoin historical dataset (00:14:59)
  • Lesson 273. Different kinds of time series patterns & different amounts of feature variables (00:07:40)
  • Lesson 274. Visualizing our Bitcoin historical data with pandas (00:04:53)
  • Lesson 275. Reading in our Bitcoin data with Python's CSV module (00:10:59)
  • Lesson 276. Creating train and test splits for time series (the wrong way) (00:08:38)
  • Lesson 277. Creating train and test splits for time series (the right way) (00:07:13)
  • Lesson 278. Creating a plotting function to visualize our time series data (00:07:58)
  • Lesson 279. Discussing the various modelling experiments we're going to be running (00:09:12)
  • Lesson 280. Model 0: Making and visualizing a naive forecast model (00:12:17)
  • Lesson 281. Discussing some of the most common time series evaluation metrics (00:11:12)
  • Lesson 282. Implementing MASE with TensorFlow (00:09:39)
  • Lesson 283. Creating a function to evaluate our model's forecasts with various metrics (00:10:12)
  • Lesson 284. Discussing other non-TensorFlow kinds of time series forecasting models (00:05:07)
  • Lesson 285. Formatting data Part 2: Creating a function to label our windowed time series (00:13:02)
  • Lesson 286. Discussing the use of windows and horizons in time series data (00:07:51)
  • Lesson 287. Writing a preprocessing function to turn time series data into windows & labels (00:23:36; see sketch 7 after this list)
  • Lesson 288. Turning our windowed time series data into training and test sets (00:10:02)
  • Lesson 289. Creating a modelling checkpoint callback to save our best performing model (00:07:26)
  • Lesson 290. Model 1: Building, compiling and fitting a deep learning model on Bitcoin data (00:16:59)
  • Lesson 291. Creating a function to make predictions with our trained models (00:14:04)
  • Lesson 292. Model 2: Building, fitting and evaluating a deep model with a larger window size (00:17:44)
  • Lesson 293. Model 3: Building, fitting and evaluating a model with a larger horizon size (00:13:16)
  • Lesson 294. Adjusting the evaluation function to work for predictions with larger horizons (00:08:35)
  • Lesson 295. Model 3: Visualizing the results (00:08:45)
  • Lesson 296. Comparing our modelling experiments so far and discussing autocorrelation (00:09:45)
  • Lesson 297. Preparing data for building a Conv1D model (00:13:22)
  • Lesson 298. Model 4: Building, fitting and evaluating a Conv1D model on our Bitcoin data (00:14:52)
  • Lesson 299. Model 5: Building, fitting and evaluating an LSTM (RNN) model on our Bitcoin data (00:16:06)
  • Lesson 300. Investigating how to turn our univariate time series into multivariate (00:13:53)
  • Lesson 301. Creating and plotting a multivariate time series with BTC price and block reward (00:12:13)
  • Lesson 302. Preparing our multivariate time series for a model (00:13:38)
  • Lesson 303. Model 6: Building, fitting and evaluating a multivariate time series model (00:09:26)
  • Lesson 304. Model 7: Discussing what we're going to be doing with the N-BEATS algorithm (00:09:40)
  • Lesson 305. Model 7: Replicating the N-BEATS basic block with TensorFlow layer subclassing (00:18:39)
  • Lesson 306. Model 7: Testing our N-BEATS block implementation with dummy data inputs (00:15:03)
  • Lesson 307. Model 7: Setting up hyperparameters for the N-BEATS algorithm (00:08:51)
  • Lesson 308. Model 7: Getting ready for residual connections (00:12:56)
  • Lesson 309. Model 7: Outlining the steps we're going to take to build the N-BEATS model (00:10:06)
  • Lesson 310. Model 7: Putting together the pieces of the puzzle of the N-BEATS model (00:22:23)
  • Lesson 311. Model 7: Plotting the N-BEATS algorithm we've created and admiring its beauty (00:06:47)
  • Lesson 312. Model 8: Ensemble model overview (00:04:44)
  • Lesson 313. Model 8: Building, compiling and fitting an ensemble of models (00:20:05)
  • Lesson 314. Model 8: Making and evaluating predictions with our ensemble model (00:16:10)
  • Lesson 315. Discussing the importance of prediction intervals in forecasting (00:12:57)
  • Lesson 316. Getting the upper and lower bounds of our prediction intervals (00:07:58)
  • Lesson 317. Plotting the prediction intervals of our ensemble model predictions (00:13:03)
  • Lesson 318. (Optional) Discussing the types of uncertainty in machine learning (00:13:42)
  • Lesson 319. Model 9: Preparing data to create a model capable of predicting into the future (00:08:25)
  • Lesson 320. Model 9: Building, compiling and fitting a future predictions model (00:05:02)
  • Lesson 321. Model 9: Discussing what's required for our model to make future predictions (00:08:31)
  • Lesson 322. Model 9: Creating a function to make forecasts into the future (00:12:09)
  • Lesson 323. Model 9: Plotting our model's future forecasts (00:13:10)
  • Lesson 324. Model 10: Introducing the turkey problem and making data for it (00:14:16)
  • Lesson 325. Model 10: Building a model to predict on turkey data (why forecasting is BS) (00:13:39)
  • Lesson 326. Comparing the results of all of our models and discussing where to go next (00:13:00)
  • Lesson 327. What is the TensorFlow Developer Certification? (00:05:29)
  • Lesson 328. Why the TensorFlow Developer Certification? (00:06:58)
  • Lesson 329. How to prepare (your brain) for the TensorFlow Developer Certification (00:08:15)
  • Lesson 330. How to prepare (your computer) for the TensorFlow Developer Certification (00:12:44)
  • Lesson 331. What to do after the TensorFlow Developer Certification exam (00:02:14)
  • Lesson 332. What is Machine Learning? (00:04:52)
  • Lesson 333. AI/Machine Learning/Data Science (00:06:53)
  • Lesson 334. Exercise: Machine Learning Playground (00:06:17)
  • Lesson 335. How Did We Get Here? (00:06:04)
  • Lesson 336. Exercise: YouTube Recommendation Engine (00:04:25)
  • Lesson 337. Types of Machine Learning (00:04:42)
  • Lesson 338. What Is Machine Learning? Round 2 (00:04:45)
  • Lesson 339. Section Review (00:01:49)
  • Lesson 340. Section Overview (00:02:39)
  • Lesson 341. Introducing Our Framework (00:03:09)
  • Lesson 342. 6 Step Machine Learning Framework (00:05:00)
  • Lesson 343. Types of Machine Learning Problems (00:10:33)
  • Lesson 344. Types of Data (00:04:51)
  • Lesson 345. Types of Evaluation (00:03:32)
  • Lesson 346. Features In Data (00:05:59)
  • Lesson 347. Modelling - Splitting Data (00:05:23)
  • Lesson 348. Modelling - Picking the Model (00:04:36)
  • Lesson 349. Modelling - Tuning (00:03:18)
  • Lesson 350. Modelling - Comparison (00:03:36)
  • Lesson 351. Experimentation (00:09:33)
  • Lesson 352. Tools We Will Use (00:04:01)
  • Lesson 353. Section Overview (00:02:28)
  • Lesson 354. Pandas Introduction (00:04:30)
  • Lesson 355. Series, Data Frames and CSVs (00:13:22)
  • Lesson 356. Describing Data with Pandas (00:09:49)
  • Lesson 357. Selecting and Viewing Data with Pandas (00:11:09)
  • Lesson 358. Selecting and Viewing Data with Pandas Part 2 (00:13:07)
  • Lesson 359. Manipulating Data (00:13:57)
  • Lesson 360. Manipulating Data 2 (00:09:57)
  • Lesson 361. Manipulating Data 3 (00:10:13)
  • Lesson 362. How To Download The Course Assignments (00:02:41)
  • Lesson 363. Section Overview (00:07:44)
  • Lesson 364. NumPy Introduction (00:05:18)
  • Lesson 365. NumPy DataTypes and Attributes (00:14:06)
  • Lesson 366. Creating NumPy Arrays (00:09:23)
  • Lesson 367. NumPy Random Seed (00:07:18)
  • Lesson 368. Viewing Arrays and Matrices (00:09:36)
  • Lesson 369. Manipulating Arrays (00:11:32)
  • Lesson 370. Manipulating Arrays 2 (00:09:45)
  • Lesson 371. Standard Deviation and Variance (00:07:11)
  • Lesson 372. Reshape and Transpose (00:07:27)
  • Lesson 373. Dot Product vs Element Wise (00:11:46)
  • Lesson 374. Exercise: Nut Butter Store Sales (00:03:34)
  • Lesson 375. Comparison Operators (00:13:05)
  • Lesson 376. Sorting Arrays (00:06:20)
  • Lesson 377. Turn Images Into NumPy Arrays (00:07:38)
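
Sketch 1 (Lessons 10-11). A minimal illustration of creating tensors with tf.constant() and tf.Variable(), assuming TensorFlow 2.x; the values and variable names are placeholders, not the course's notebook code.

```python
import tensorflow as tf

# Immutable tensors: values are fixed once created
scalar = tf.constant(7)
matrix = tf.constant([[1., 2.], [3., 4.]])

# Mutable tensor: values can be updated in place with .assign()
weights = tf.Variable([1.0, 2.0])
weights.assign([3.0, 4.0])   # replace all values
weights[0].assign(10.0)      # update a single element

print(scalar.ndim, matrix.shape, weights.numpy())
```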
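
Sketch 2 (Lesson 34). A minimal sketch of the usual create, compile, fit, predict workflow on toy regression data; the y = X + 10 data is an assumption for illustration, not necessarily the course's exact dataset.

```python
import tensorflow as tf

# Toy regression data: y = X + 10 (illustrative assumption)
X = tf.range(-100., 100., 4.)
y = X + 10.

# 1. Create a model
model = tf.keras.Sequential([tf.keras.layers.Dense(1)])

# 2. Compile: define loss, optimizer and metrics
model.compile(loss="mae",
              optimizer=tf.keras.optimizers.SGD(),
              metrics=["mae"])

# 3. Fit the model to the data (features need a trailing feature axis)
model.fit(tf.expand_dims(X, axis=-1), y, epochs=100, verbose=0)

# 4. Predict on an unseen value
print(model.predict(tf.constant([[17.0]])))
```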
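
Sketch 3 (Lesson 103). A minimal sketch of loading image data with ImageDataGenerator, assuming a directory with one sub-folder per class; the "pizza_steak/train" path is a placeholder.

```python
from tensorflow.keras.preprocessing.image import ImageDataGenerator

# Rescale pixel values from [0, 255] to [0, 1]
train_datagen = ImageDataGenerator(rescale=1/255.)

# "pizza_steak/train" is a placeholder path: one sub-folder per class
train_data = train_datagen.flow_from_directory("pizza_steak/train",
                                               target_size=(224, 224),
                                               batch_size=32,
                                               class_mode="binary")
```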
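
Sketch 4 (Lesson 166). A minimal sketch of a headless (include_top=False) EfficientNetB0 feature extractor with a new classification head, assuming TensorFlow 2.x and a 10-class problem; the data augmentation layer named in the lesson title is omitted here for brevity.

```python
import tensorflow as tf

# Frozen EfficientNetB0 base without its original classifier head
base_model = tf.keras.applications.EfficientNetB0(include_top=False)
base_model.trainable = False

inputs = tf.keras.layers.Input(shape=(224, 224, 3), name="input_layer")
x = base_model(inputs, training=False)           # keep BatchNorm in inference mode
x = tf.keras.layers.GlobalAveragePooling2D()(x)  # pool feature maps to a feature vector
outputs = tf.keras.layers.Dense(10, activation="softmax")(x)  # 10 classes assumed
model = tf.keras.Model(inputs, outputs)

model.compile(loss="categorical_crossentropy",
              optimizer=tf.keras.optimizers.Adam(),
              metrics=["accuracy"])
```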
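
Sketch 5 (Lesson 193). A minimal sketch of turning on mixed precision training, assuming TensorFlow 2.4+ and a GPU with compute capability 7.0 or higher; the 101-unit output layer is an assumption (Food101 has 101 classes).

```python
import tensorflow as tf
from tensorflow.keras import mixed_precision

# Compute in float16 while keeping variables in float32 on supported GPUs
mixed_precision.set_global_policy("mixed_float16")
print(mixed_precision.global_policy())  # shows the active dtype policy

# Keep the final layer's outputs in float32 for numerically stable softmax values
outputs = tf.keras.layers.Dense(101, activation="softmax", dtype=tf.float32)
```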
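
Sketch 6 (Lessons 205-207). A minimal sketch of a TextVectorization layer feeding an Embedding layer, assuming TensorFlow 2.6+ (where TextVectorization lives under tf.keras.layers); the sentences, max_tokens and output_dim values are placeholders.

```python
import tensorflow as tf

sentences = ["the weather today is bad", "a flood warning was issued"]  # placeholder text

# Map raw strings to integer token ids
text_vectorizer = tf.keras.layers.TextVectorization(max_tokens=10000,
                                                    output_mode="int",
                                                    output_sequence_length=15)
text_vectorizer.adapt(sentences)  # build the vocabulary from the text

# Map token ids to dense, trainable embedding vectors
embedding = tf.keras.layers.Embedding(input_dim=10000, output_dim=128)

tokens = text_vectorizer(sentences)  # shape: (2, 15)
vectors = embedding(tokens)          # shape: (2, 15, 128)
print(tokens.shape, vectors.shape)
```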
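
Sketch 7 (Lessons 285-287). A minimal NumPy sketch of turning a 1D series into (window, label) pairs; the function name and the window_size=7, horizon=1 defaults are illustrative assumptions, not the course's exact helper.

```python
import numpy as np

def make_windows_and_labels(values, window_size=7, horizon=1):
    """Slice a 1D series into (window, label) pairs: each window of
    `window_size` past values is paired with the next `horizon` values."""
    windows, labels = [], []
    for i in range(len(values) - window_size - horizon + 1):
        windows.append(values[i:i + window_size])
        labels.append(values[i + window_size:i + window_size + horizon])
    return np.array(windows), np.array(labels)

prices = np.arange(10.0)  # stand-in for a price series
windows, labels = make_windows_and_labels(prices, window_size=7, horizon=1)
print(windows.shape, labels.shape)  # (3, 7) (3, 1)
```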