CS 189 at UC Berkeley

# Introduction to Machine Learning

Lectures: 2:30 - 4pm Mon-Thurs in LeConte 4

## Week 0 Overview

### Linear Regression, Features, Hyperparameters, and Cross-Validation

- Discussion 0
- Discussion 01
- Homework 0
- Homework 01

## Notes

See the Syllabus for more information, including a week-by-week list of topics. These notes are from a previous iteration of the course and may not be comprehensive; they are not a substitute for attending lecture, where additional material may be covered. Refer to the lectures.

- Note 1: Introduction
- Note 2: Linear Regression
- Note 3: Features, Hyperparameters, Validation
- Note 4: MLE and MAP for Regression (Part I)
- Note 5: Bias-Variance Tradeoff
- Note 6: Multivariate Gaussians
- Note 7: MLE and MAP for Regression (Part II)
- Note 8: Kernels, Kernel Ridge Regression
- Note 9: Total Least Squares
- Note 10: Principal Component Analysis (PCA)
- Note 11: Canonical Correlation Analysis (CCA)
- Note 12: Nonlinear Least Squares
- Note 13: Optimization
- Note 14: Neural Networks
- Note 15: Training Neural Networks
- Note 16: Discriminative vs. Generative Classification, LS-SVM
- Note 17: Logistic Regression
- Note 18: Gaussian Discriminant Analysis
- Note 19: Expectation-Maximization (EM) Algorithm, k-means Clustering
- Note 20: Support Vector Machines (SVM)
- Note 21: Generalization and Stability
- Note 22: Duality
- Note 23: Nearest Neighbor Classification
- Note 24: Sparsity
- Note 25: Decision Trees and Random Forests
- Note 26: Boosting
- Note 27: Convolutional Neural Networks (CNN)


## Discussions

Discussion sections may cover new material and give you additional practice solving problems. You may attend any discussion section you like. See the Syllabus for more information.

- Discussion 0: Vector Calculus, Linear Algebra
- Discussion 01: Derivatives Review, Least Squares
- Discussion 02: Ridge Regression
- Discussion 03: Bias-Variance Tradeoff
- Discussion 04: Kernel and Multivariate Gaussians
- Discussion 05: Dimensionality Reduction
- Discussion 06: Midterm Review
- Discussion 07: Backpropagation
- Discussion 08: GD/SGD
- Discussion 09: QDA and Logistic Regression
- Discussion 10: Expectation Maximization
- Discussion 11: SVMs/Nearest Neighbors
- Discussion 12: Orthogonal Matching Pursuit
- Discussion 13: Convolutional Neural Networks
- Discussion 14: Clustering


## Homeworks

All homeworks are fully graded. Your lowest homework score will be dropped, but this drop should be reserved for emergencies. See the Syllabus for more information.

- Homework 0: Review and Linear Regression
- Homework 01: Least Squares
- Homework 02: Ridge Regression
- Homework 03: Probabilistic Models
- Homework 04: Kernel Methods
- Homework 05: Dimensionality Reduction
- Homework 06: CCA and Midterm Redo
- Homework 07: Backpropagation
- Homework 08: SGD and Classification
- Homework 09: LDA, CCA
- Homework 10: K Means and EM
- Homework 11: SVMs and Neighbors
- Homework 12: Sparsity and Decision Trees
- Homework 13: Boosting, Convolutional Neural Networks
- Homework 14: K-SVD and Dropout
