Lesson 1. 00:04:29
[Important] Getting the most out of this course
Lesson 2. 00:04:10
About using MATLAB or Python
Lesson 3. 00:08:48
Statistics guessing game!
Lesson 4. 00:05:17
Using the Q&A forum
Lesson 5. 00:01:53
(optional) Entering time-stamped notes in the Udemy video player
Lesson 6. 00:03:13
Should you memorize statistical formulas?
Lesson 7. 00:04:03
Arithmetic and exponents
Lesson 8. 00:05:54
Scientific notation
Lesson 9. 00:04:22
Summation notation
Lesson 10. 00:03:05
Absolute value
Lesson 11. 00:05:54
Natural exponent and logarithm
Lesson 12. 00:08:59
The logistic function
Lesson 13. 00:06:31
Rank and tied-rank
Lesson 14. 00:03:49
Download materials for the entire course!
Lesson 15. 00:01:54
Is "data" singular or plural?!?!!?!
Lesson 16. 00:06:10
Where do data come from and what do they mean?
Lesson 17. 00:14:57
Types of data: categorical, numerical, etc.
Lesson 18. 00:08:59
Code: representing types of data on computers
Lesson 19. 00:12:03
Sample vs. population data
Lesson 20. 00:05:32
Samples, case reports, and anecdotes
Lesson 21. 00:06:58
The ethics of making up data
Lesson 22. 00:11:39
Bar plots
Lesson 23. 00:17:00
Code: bar plots
Lesson 24. 00:05:42
Box-and-whisker plots
Lesson 25. 00:08:42
Code: box plots
Lesson 26. 00:02:32
"Unsupervised learning": Boxplots of normal and uniform noise
Lesson 27. 00:11:17
Histograms
Lesson 28. 00:16:41
Code: histograms
Lesson 29. 00:02:23
"Unsupervised learning": Histogram proportion
Lesson 30. 00:06:00
Pie charts
Lesson 31. 00:13:23
Code: pie charts
Lesson 32. 00:06:12
When to use lines instead of bars
Lesson 33. 00:09:05
Linear vs. logarithmic axis scaling
Lesson 34. 00:07:25
Code: line plots
Lesson 35. 00:01:45
"Unsupervised learning": log-scaled plots
Lesson 36. 00:04:32
Descriptive vs. inferential statistics
Lesson 37. 00:07:30
Accuracy, precision, resolution
Lesson 38. 00:11:27
Data distributions
Lesson 39. 00:32:09
Code: data from different distributions
Lesson 40. 00:01:58
"Unsupervised learning": histograms of distributions
Lesson 41. 00:05:30
The beauty and simplicity of Normal
Lesson 42. 00:12:48
Measures of central tendency (mean)
Lesson 43. 00:12:18
Measures of central tendency (median, mode)
Lesson 44. 00:13:59
Code: computing central tendency
Lesson 45. 00:03:09
"Unsupervised learning": central tendencies with outliers
Lesson 46. 00:17:49
Measures of dispersion (variance, standard deviation)
Lesson 47. 00:26:34
Code: Computing dispersion
Lesson 48. 00:04:54
Interquartile range (IQR)
Lesson 49. 00:15:59
Code: IQR
Lesson 50. 00:07:23
QQ plots
Lesson 51. 00:15:35
Code: QQ plots
Lesson 52. 00:08:24
Statistical "moments"
Lesson 53. 00:10:01
Histograms part 2: Number of bins
Lesson 54. 00:12:25
Code: Histogram bins
Lesson 55. 00:03:20
Violin plots
Lesson 56. 00:10:10
Code: violin plots
Lesson 57. 00:02:32
"Unsupervised learning": asymmetric violin plots
Lesson 58. 00:11:03
Shannon entropy
Lesson 59. 00:20:16
Code: entropy
Lesson 60. 00:01:27
"Unsupervised learning": entropy and number of bins
Lesson 61. 00:04:11
Garbage in, garbage out (GIGO)
Lesson 62. 00:09:26
Z-score standardization
Lesson 63. 00:12:51
Code: z-score
Lesson 64. 00:05:07
Min-max scaling
Lesson 65. 00:08:17
Code: min-max scaling
Lesson 66. 00:02:36
"Unsupervised learning": Invert the min-max scaling
Lesson 67. 00:14:27
What are outliers and why are they dangerous?
Lesson 68. 00:09:27
Removing outliers: z-score method
Lesson 69. 00:04:05
The modified z-score method
Lesson 70. 00:22:31
Code: z-score for outlier removal
Lesson 71. 00:02:39
"Unsupervised learning": z vs. modified-z
Lesson 72. 00:09:27
Multivariate outlier detection
Lesson 73. 00:09:02
Code: Euclidean distance for outlier removal
Lesson 74. 00:05:48
Removing outliers by data trimming
Lesson 75. 00:11:04
Code: Data trimming to remove outliers
Lesson 76. 00:04:41
Non-parametric solutions to outliers
Lesson 77. 00:03:05
An outlier lecture on personal accountability
Lesson 78. 00:12:18
What is probability?
Lesson 79. 00:09:26
Probability vs. proportion
Lesson 80. 00:10:29
Computing probabilities
Lesson 81. 00:14:35
Code: compute probabilities
Lesson 82. 00:04:59
Probability and odds
Lesson 83. 00:02:31
"Unsupervised learning": probabilities of odds-space
Lesson 84. 00:13:07
Probability mass vs. density
Lesson 85. 00:11:38
Code: compute probability mass functions
Lesson 86. 00:10:45
Cumulative probability distributions
Lesson 87. 00:09:42
Code: CDFs and PDFs
Lesson 88. 00:02:26
"Unsupervised learning": CDFs for various distributions
Lesson 89. 00:18:32
Creating sample estimate distributions
Lesson 90. 00:02:54
Monte Carlo sampling
Lesson 91. 00:08:42
Sampling variability, noise, and other annoyances
Lesson 92. 00:26:16
Code: sampling variability
Lesson 93. 00:10:10
Expected value
Lesson 94. 00:12:46
Conditional probability
Lesson 95. 00:20:13
Code: conditional probabilities
Lesson 96. 00:06:25
Tree diagrams for conditional probabilities
Lesson 97. 00:09:51
The Law of Large Numbers
Lesson 98. 00:19:24
Code: Law of Large Numbers in action
Lesson 99. 00:10:35
The Central Limit Theorem
Lesson 100. 00:16:22
Code: the CLT in action
Lesson 101. 00:02:10
"Unsupervised learning": Averaging pairs of numbers
Lesson 102. 00:16:46
IVs, DVs, models, and other stats lingo
Lesson 103. 00:15:09
What is a hypothesis and how do you specify one?
Lesson 104. 00:10:39
Sample distributions under null and alternative hypotheses
Lesson 105. 00:18:57
P-values: definition, tails, and misinterpretations
Lesson 106. 00:06:52
P-z combinations that you should memorize
Lesson 107. 00:12:22
Degrees of freedom
Lesson 108. 00:14:19
Type 1 and Type 2 errors
Lesson 109. 00:09:13
Parametric vs. non-parametric tests
Lesson 110. 00:08:34
Multiple comparisons and Bonferroni correction
Lesson 111. 00:06:52
Statistical vs. theoretical vs. clinical significance
Lesson 112. 00:11:31
Cross-validation
Lesson 113. 00:11:13
Statistical significance vs. classification accuracy
Lesson 114. 00:13:14
Purpose and interpretation of the t-test
Lesson 115. 00:08:10
One-sample t-test
Lesson 116. 00:20:47
Code: One-sample t-test
Lesson 117. 00:02:51
"Unsupervised learning": The role of variance
Lesson 118. 00:13:07
Two-sample t-test
Lesson 119. 00:22:10
Code: Two-sample t-test
Lesson 120. 00:04:46
"Unsupervised learning": Importance of N for t-test
Lesson 121. 00:07:37
Wilcoxon signed-rank (nonparametric t-test)
Lesson 122. 00:18:35
Code: Signed-rank test
Lesson 123. 00:06:04
Mann-Whitney U test (nonparametric t-test)
Lesson 124. 00:05:22
Code: Mann-Whitney U test
Lesson 125. 00:11:27
Permutation testing for t-test significance
Lesson 126. 00:25:27
Code: permutation testing
Lesson 127. 00:05:22
"Unsupervised learning": How many permutations?
Lesson 128. 00:08:46
What are confidence intervals and why do we need them?
Lesson 129. 00:06:44
Computing confidence intervals via formula
Lesson 130. 00:17:12
Code: compute confidence intervals by formula
Lesson 131. 00:08:59
Confidence intervals via bootstrapping (resampling)
Lesson 132. 00:14:34
Code: bootstrapping confidence intervals
Lesson 133. 00:01:26
"Unsupervised learning": Confidence intervals for variance
Lesson 134. 00:06:23
Misconceptions about confidence intervals
Lesson 135. 00:18:21
Motivation and description of correlation
Lesson 136. 00:14:10
Covariance and correlation: formulas
Lesson 137. 00:27:50
Code: correlation coefficient
Lesson 138. 00:13:51
Code: Simulate data with specified correlation
Lesson 139. 00:09:35
Correlation matrix
Lesson 140. 00:20:26
Code: correlation matrix
Lesson 141. 00:02:52
"Unsupervised learning": average correlation matrices
Lesson 142. 00:04:17
"Unsupervised learning": correlation to covariance matrix
Lesson 143. 00:10:24
Partial correlation
Lesson 144. 00:19:56
Code: partial correlation
Lesson 145. 00:06:44
The problem with Pearson
Lesson 146. 00:07:18
Nonparametric correlation: Spearman rank
Lesson 147. 00:06:55
Fisher-Z transformation for correlations
Lesson 148. 00:07:41
Code: Spearman correlation and Fisher-Z
Lesson 149. 00:01:29
"Unsupervised learning": Spearman correlation
Lesson 150. 00:02:26
"Unsupervised learning": confidence interval on correlation
Lesson 151. 00:10:38
Kendall's correlation for ordinal data
Lesson 152. 00:18:10
Code: Kendall correlation
Lesson 153. 00:02:40
"Unsupervised learning": Does Kendall vs. Pearson matter?
Lesson 154. 00:04:42
The subgroups correlation paradox
Lesson 155. 00:05:27
Cosine similarity
Lesson 156. 00:21:20
Code: Cosine similarity vs. Pearson correlation
Lesson 157. 00:17:52
ANOVA intro, part 1
Lesson 158. 00:19:57
ANOVA intro, part 2
Lesson 159. 00:18:14
Sum of squares
Lesson 160. 00:07:29
The F-test and the ANOVA table
Lesson 161. 00:12:39
The omnibus F-test and post-hoc comparisons
Lesson 162. 00:19:53
The two-way ANOVA
Lesson 163. 00:13:25
One-way ANOVA example
Lesson 164. 00:16:35
Code: One-way ANOVA (independent samples)
Lesson 165. 00:12:18
Code: One-way repeated-measures ANOVA
Lesson 166. 00:11:18
Two-way ANOVA example
Lesson 167. 00:14:29
Code: Two-way mixed ANOVA
Lesson 168. 00:19:54
Introduction to GLM / regression
Lesson 169. 00:09:47
Least-squares solution to the GLM
Lesson 170. 00:16:18
Evaluating regression models: R² and F
Lesson 171. 00:13:18
Simple regression
Lesson 172. 00:09:13
Code: simple regression
Lesson 173. 00:01:06
"Unsupervised learning": Compute R² and F
Lesson 174. 00:13:02
Multiple regression
Lesson 175. 00:12:19
Standardizing regression coefficients
Lesson 176. 00:18:43
Code: Multiple regression
Lesson 177. 00:08:57
Polynomial regression models
Lesson 178. 00:15:47
Code: polynomial modeling
Lesson 179. 00:00:53
"Unsupervised learning": Polynomial design matrix
Lesson 180. 00:16:56
Logistic regression
Lesson 181. 00:09:28
Code: Logistic regression
Lesson 182. 00:16:20
Under- and over-fitting
Lesson 183. 00:01:57
"Unsupervised learning": Overfit data
Lesson 184. 00:12:26
Comparing "nested" models
Lesson 185. 00:06:37
What to do about missing data
Lesson 186. 00:09:48
What is statistical power and why is it important?
Lesson 187. 00:11:23
Estimating statistical power and sample size
Lesson 188. 00:04:11
Compute power and sample size using G*Power
Lesson 189. 00:13:47
K-means clustering
Lesson 190. 00:22:12
Code: k-means clustering
Lesson 191. 00:01:54
"Unsupervised learning": K-means and normalization
Lesson 192. 00:01:27
"Unsupervised learning": K-means on a Gaussian blur
Lesson 193. 00:14:19
Clustering via DBSCAN
Lesson 194. 00:33:04
Code: DBSCAN
Lesson 195. 00:03:05
"Unsupervised learning": DBSCAN vs. k-means
Lesson 196. 00:06:21
K-nearest neighbor classification
Lesson 197. 00:11:49
Code: KNN
Lesson 198. 00:16:35
Principal components analysis (PCA)
Lesson 199. 00:17:33
Code: PCA
Lesson 200. 00:01:36
"Unsupervised learning": K-means on PC data
Lesson 201. 00:12:46
Independent components analysis (ICA)
Lesson 202. 00:12:41
Code: ICA
Lesson 203. 00:05:30
The two perspectives of the world
Lesson 204. 00:12:31
d-prime
Lesson 205. 00:15:03
Code: d-prime
Lesson 206. 00:08:03
Response bias
Lesson 207. 00:04:16
Code: Response bias
Lesson 208. 00:07:35
Receiver operating characteristics (ROC)
Lesson 209. 00:08:11
Code: ROC curves
Lesson 210. 00:01:34
"Unsupervised learning": Make this plot look nicer!