  1. Lesson 1. 00:00:58
    Getting Started - How to Get Help
  2. Lesson 2. 00:06:05
    Solving Machine Learning Problems
  3. Lesson 3. 00:09:54
    A Complete Walkthrough
  4. Lesson 4. 00:02:02
    App Setup
  5. Lesson 5. 00:02:54
    Problem Outline
  6. Lesson 6. 00:04:12
    Identifying Relevant Data
  7. Lesson 7. 00:05:48
    Dataset Structures
  8. Lesson 8. 00:04:00
    Recording Observation Data
  9. Lesson 9. 00:04:36
    What Type of Problem?
  10. Lesson 10. 00:08:24
    How K-Nearest Neighbor Works
  11. Lesson 11. 00:09:57
    Lodash Review
  12. Lesson 12. 00:07:17
    Implementing KNN
  13. Lesson 13. 00:05:54
    Finishing KNN Implementation
  14. Lesson 14. 00:04:49
    Testing the Algorithm
  15. Lesson 15. 00:04:13
    Interpreting Bad Results
  16. Lesson 16. 00:04:06
    Test and Training Data
  17. Lesson 17. 00:03:49
    Randomizing Test Data
  18. Lesson 18. 00:03:42
    Generalizing KNN
  19. Lesson 19. 00:05:19
    Gauging Accuracy
  20. Lesson 20. 00:03:30
    Printing a Report
  21. Lesson 21. 00:05:14
    Refactoring Accuracy Reporting
  22. Lesson 22. 00:11:39
    Investigating Optimal K Values
  23. Lesson 23. 00:06:37
    Updating KNN for Multiple Features
  24. Lesson 24. 00:03:57
    Multi-Dimensional KNN
  25. Lesson 25. 00:09:51
    N-Dimension Distance
  26. Lesson 26. 00:08:28
    Arbitrary Feature Spaces
  27. Lesson 27. 00:05:37
    Magnitude Offsets in Features
  28. Lesson 28. 00:07:33
    Feature Normalization
  29. Lesson 29. 00:07:15
    Normalization with MinMax
  30. Lesson 30. 00:04:23
    Applying Normalization
  31. Lesson 31. 00:07:48
    Feature Selection with KNN
  32. Lesson 32. 00:06:11
    Objective Feature Picking
  33. Lesson 33. 00:02:54
    Evaluating Different Feature Values
  34. Lesson 34. 00:07:28
    Let's Get Our Bearings
  35. Lesson 35. 00:04:32
    A Plan to Move Forward
  36. Lesson 36. 00:12:05
    Tensor Shape and Dimension
  37. Lesson 37. 00:08:19
    Elementwise Operations
  38. Lesson 38. 00:06:48
    Broadcasting Operations
  39. Lesson 39. 00:03:48
    Logging Tensor Data
  40. Lesson 40. 00:05:25
    Tensor Accessors
  41. Lesson 41. 00:07:47
    Creating Slices of Data
  42. Lesson 42. 00:05:29
    Tensor Concatenation
  43. Lesson 43. 00:05:14
    Summing Values Along an Axis
  44. Lesson 44. 00:07:48
    Massaging Dimensions with ExpandDims
  45. Lesson 45. 00:04:57
    KNN with Regression
  46. Lesson 46. 00:04:05
    A Change in Data Structure
  47. Lesson 47. 00:09:19
    KNN with TensorFlow
  48. Lesson 48. 00:06:31
    Maintaining Order Relationships
  49. Lesson 49. 00:08:01
    Sorting Tensors
  50. Lesson 50. 00:07:44
    Averaging Top Values
  51. Lesson 51. 00:03:27
    Moving to the Editor
  52. Lesson 52. 00:10:11
    Loading CSV Data
  53. Lesson 53. 00:06:11
    Running an Analysis
  54. Lesson 54. 00:06:27
    Reporting Error Percentages
  55. Lesson 55. 00:07:34
    Normalization or Standardization?
  56. Lesson 56. 00:07:38
    Numerical Standardization with TensorFlow
  57. Lesson 57. 00:04:02
    Applying Standardization
  58. Lesson 58. 00:08:15
    Debugging Calculations
  59. Lesson 59. 00:04:01
    What Now?
  60. Lesson 60. 00:02:40
    Linear Regression
  61. Lesson 61. 00:04:53
    Why Linear Regression?
  62. Lesson 62. 00:13:05
    Understanding Gradient Descent
  63. Lesson 63. 00:10:20
    Guessing Coefficients with MSE
  64. Lesson 64. 00:05:57
    Observations Around MSE
  65. Lesson 65. 00:07:13
    Derivatives!
  66. Lesson 66. 00:11:47
    Gradient Descent in Action
  67. Lesson 67. 00:05:47
    Quick Breather and Review
  68. Lesson 68. 00:17:06
    Why a Learning Rate?
  69. Lesson 69. 00:03:49
    Answering Common Questions
  70. Lesson 70. 00:04:44
    Gradient Descent with Multiple Terms
  71. Lesson 71. 00:10:40
    Multiple Terms in Action
  72. Lesson 72. 00:06:02
    Project Overview
  73. Lesson 73. 00:05:18
    Data Loading
  74. Lesson 74. 00:08:33
    Default Algorithm Options
  75. Lesson 75. 00:03:19
    Formulating the Training Loop
  76. Lesson 76. 00:09:25
    Initial Gradient Descent Implementation
  77. Lesson 77. 00:06:53
    Calculating MSE Slopes
  78. Lesson 78. 00:03:12
    Updating Coefficients
  79. Lesson 79. 00:10:08
    Interpreting Results
  80. Lesson 80. 00:07:10
    Matrix Multiplication
  81. Lesson 81. 00:06:41
    More on Matrix Multiplication
  82. Lesson 82. 00:06:22
    Matrix Form of Slope Equations
  83. Lesson 83. 00:09:29
    Simplification with Matrix Multiplication
  84. Lesson 84. 00:14:02
    How it All Works Together!
  85. Lesson 85. 00:07:41
    Refactoring the Linear Regression Class
  86. Lesson 86. 00:08:59
    Refactoring to One Equation
  87. Lesson 87. 00:06:14
    A Few More Changes
  88. Lesson 88. 00:03:20
    Same Results? Or Not?
  89. Lesson 89. 00:08:38
    Calculating Model Accuracy
  90. Lesson 90. 00:07:45
    Implementing Coefficient of Determination
  91. Lesson 91. 00:07:48
    Dealing with Bad Accuracy
  92. Lesson 92. 00:04:37
    Reminder on Standardization
  93. Lesson 93. 00:03:39
    Data Processing in a Helper Method
  94. Lesson 94. 00:05:58
    Reapplying Standardization
  95. Lesson 95. 00:05:37
    Fixing Standardization Issues
  96. Lesson 96. 00:03:16
    Massaging Learning Rates
  97. Lesson 97. 00:11:45
    Moving Towards Multivariate Regression
  98. Lesson 98. 00:07:29
    Refactoring for Multivariate Analysis
  99. Lesson 99. 00:08:05
    Learning Rate Optimization
  100. Lesson 100. 00:05:22
    Recording MSE History
  101. Lesson 101. 00:06:42
    Updating Learning Rate
  102. Lesson 102. 00:04:18
    Observing Changing Learning Rate and MSE
  103. Lesson 103. 00:05:22
    Plotting MSE Values
  104. Lesson 104. 00:04:23
    Plotting MSE History against B Values
  105. Lesson 105. 00:07:18
    Batch and Stochastic Gradient Descent
  106. Lesson 106. 00:05:07
    Refactoring Towards Batch Gradient Descent
  107. Lesson 107. 00:06:03
    Determining Batch Size and Quantity
  108. Lesson 108. 00:07:49
    Iterating Over Batches
  109. Lesson 109. 00:05:42
    Evaluating Batch Gradient Descent Results
  110. Lesson 110. 00:07:38
    Making Predictions with the Model
  111. Lesson 111. 00:02:28
    Introducing Logistic Regression
  112. Lesson 112. 00:06:32
    Logistic Regression in Action
  113. Lesson 113. 00:05:32
    Bad Equation Fits
  114. Lesson 114. 00:04:32
    The Sigmoid Equation
  115. Lesson 115. 00:07:48
    Decision Boundaries
  116. Lesson 116. 00:01:12
    Changes for Logistic Regression
  117. Lesson 117. 00:05:52
    Project Setup for Logistic Regression
  118. Lesson 118. 00:04:28
    Importing Vehicle Data
  119. Lesson 119. 00:04:19
    Encoding Label Values
  120. Lesson 120. 00:07:09
    Updating Linear Regression for Logistic Regression
  121. Lesson 121. 00:04:28
    The Sigmoid Equation with Logistic Regression
  122. Lesson 122. 00:07:47
    A Touch More Refactoring
  123. Lesson 123. 00:03:28
    Gauging Classification Accuracy
  124. Lesson 124. 00:05:17
    Implementing a Test Function
  125. Lesson 125. 00:07:17
    Variable Decision Boundaries
  126. Lesson 126. 00:05:47
    Mean Squared Error vs Cross Entropy
  127. Lesson 127. 00:05:09
    Refactoring with Cross Entropy
  128. Lesson 128. 00:04:37
    Finishing the Cost Refactor
  129. Lesson 129. 00:03:25
    Plotting Changing Cost History
  130. Lesson 130. 00:02:20
    Multinomial Logistic Regression
  131. Lesson 131. 00:05:08
    A Smart Refactor to Multinomial Analysis
  132. Lesson 132. 00:03:46
    A Smarter Refactor!
  133. Lesson 133. 00:09:51
    A Single Instance Approach
  134. Lesson 134. 00:04:40
    Refactoring to Multi-Column Weights
  135. Lesson 135. 00:04:38
    A Problem to Test Multinomial Classification
  136. Lesson 136. 00:04:42
    Classifying Continuous Values
  137. Lesson 137. 00:06:20
    Training a Multinomial Model
  138. Lesson 138. 00:09:57
    Marginal vs Conditional Probability
  139. Lesson 139. 00:06:09
    Sigmoid vs Softmax
  140. Lesson 140. 00:04:43
    Refactoring Sigmoid to Softmax
  141. Lesson 141. 00:02:37
    Implementing Accuracy Gauges
  142. Lesson 142. 00:03:16
    Calculating Accuracy
  143. Lesson 143. 00:02:11
    Handwriting Recognition
  144. Lesson 144. 00:05:12
    Greyscale Values
  145. Lesson 145. 00:03:30
    Many Features
  146. Lesson 146. 00:06:07
    Flattening Image Data
  147. Lesson 147. 00:05:45
    Encoding Label Values
  148. Lesson 148. 00:07:27
    Implementing an Accuracy Gauge
  149. Lesson 149. 00:01:56
    Unchanging Accuracy
  150. Lesson 150. 00:08:13
    Debugging the Calculation Process
  151. Lesson 151. 00:06:16
    Dealing with Zero Variances
  152. Lesson 152. 00:02:37
    Backfilling Variance
  153. Lesson 153. 00:04:15
    Handling Large Datasets
  154. Lesson 154. 00:04:51
    Minimizing Memory Usage
  155. Lesson 155. 00:05:15
    Creating Memory Snapshots
  156. Lesson 156. 00:06:50
    The JavaScript Garbage Collector
  157. Lesson 157. 00:05:51
    Shallow vs Retained Memory Usage
  158. Lesson 158. 00:08:30
    Measuring Memory Usage
  159. Lesson 159. 00:03:15
    Releasing References
  160. Lesson 160. 00:03:51
    Measuring Footprint Reduction
  161. Lesson 161. 00:01:32
    Optimizing TensorFlow Memory Usage
  162. Lesson 162. 00:04:41
    TensorFlow's Eager Memory Usage
  163. Lesson 163. 00:02:49
    Cleaning up Tensors with Tidy
  164. Lesson 164. 00:03:32
    Implementing TF Tidy
  165. Lesson 165. 00:03:58
    Tidying the Training Loop
  166. Lesson 166. 00:01:35
    Measuring Reduced Memory Usage
  167. Lesson 167. 00:02:36
    One More Optimization
  168. Lesson 168. 00:02:45
    Final Memory Report
  169. Lesson 169. 00:04:04
    Plotting Cost History
  170. Lesson 170. 00:04:19
    NaN in Cost History
  171. Lesson 171. 00:04:47
    Fixing Cost History
  172. Lesson 172. 00:01:41
    Massaging Learning Parameters
  173. Lesson 173. 00:04:28
    Improving Model Accuracy
  174. Lesson 174. 00:02:07
    Loading CSV Files
  175. Lesson 175. 00:02:01
    A Test Dataset
  176. Lesson 176. 00:03:09
    Reading Files from Disk
  177. Lesson 177. 00:02:55
    Splitting into Columns
  178. Lesson 178. 00:02:31
    Dropping Trailing Columns
  179. Lesson 179. 00:03:37
    Parsing Number Values
  180. Lesson 180. 00:04:20
    Custom Value Parsing
  181. Lesson 181. 00:05:36
    Extracting Data Columns
  182. Lesson 182. 00:05:14
    Shuffling Data via Seed Phrase
  183. Lesson 183. 00:07:45
    Splitting Test and Training