CS 189 at UC Berkeley

# Introduction to Machine Learning

Lectures: T/Th 3:30-5 p.m., 155 Dwinelle

## Week 0: Least Squares Framework

## Week 1: Features, Regularization, Hyperparameters and Cross-Validation

- Note 1: Introduction (Draft)
- Discussion 01 (solution)
- Homework 0 (TeX) (solution)
- Homework 01 (TeX) (data) (solution)

## Week 2: MLE, MAP, OLS, Bias-Variance Tradeoffs

## Week 3: Weighted LS, Total LS, Eigenmethods

## Week 4: TLS, PCA

- Note 6: Multivariate Gaussians
- Note 7: MLE and MAP for Regression (Part II) (Draft)
- Note 8: Kernels, Kernel Ridge Regression
- Note 9: Total Least Squares (Draft)
- Note 10: Principal Component Analysis (PCA) (Draft)
- Discussion 04 (solution)
- Homework 03 (TeX) (data) (solution) (solution code)
- Homework 04 (TeX) (data) (solution)
- Homework 05 (TeX) (data) (solution)

## Week 5: CCA, Nonlinear LS, Gradient Descent

## Week 6: Neural Nets, Stochastic Gradient Descent

## Week 7: Regression for Classification: Generative vs. Discriminative

- Note 13: Gradient Descent, Newton's Method
- Note 14: Neural Nets
- Note 15: Training Neural Networks
- Discussion 07 (solution)
- Homework 07 (TeX) (data) (solution) (solution code)

## Week 8: Loss Functions, Hinge Loss, SVM

- Note 16: Discriminative vs. Generative Classification, LS-SVM (Draft)
- Note 17: Logistic Regression (Draft)
- Note 18: QDA and LDA (Draft)
- Note 19: Unsupervised Clustering
- Discussion 07 (solution)
- Discussion 08 (solution)
- Homework 08 (TeX) (data) (solution) (solution code)

## Week 9: k-Means, EM

## Notes

See the Syllabus for more information, including a list of week-by-week topics. A comprehensive compilation of the notes is also available.

- Note 1: Introduction (Draft)
- Note 2: Linear Regression
- Note 3: Features, Hyperparameters, Validation
- Note 4: MLE and MAP for Regression (Part I)
- Note 5: Bias-Variance Tradeoff
- Note 6: Multivariate Gaussians
- Note 7: MLE and MAP for Regression (Part II) (Draft)
- Note 8: Kernels, Kernel Ridge Regression
- Note 9: Total Least Squares (Draft)
- Note 10: Principal Component Analysis (PCA) (Draft)
- Note 11: Canonical Correlation Analysis (CCA) (Draft)
- Note 12: Nonlinear Least Squares (Draft)
- Note 13: Gradient Descent, Newton's Method
- Note 14: Neural Nets
- Note 15: Training Neural Networks
- Note 16: Discriminative vs. Generative Classification, LS-SVM (Draft)
- Note 17: Logistic Regression (Draft)
- Note 18: QDA and LDA (Draft)
- Note 19: Unsupervised Clustering
- Note 20: Support Vector Machines (SVM) (Draft)
- Note 21: Duality and Dual SVMs
- Note 22: Nearest Neighbor Classification
- Note 23: Sparsity
- Note 24: Decision Trees and Random Forests
- Note 25: Boosting
- Note 26: Convolutional Neural Networks (CNN)
- Note 27: Autoencoders
- Note 28: Generative Adversarial Networks (GAN)


## Discussions

The discussion sections may cover new material and will give you additional practice solving problems. You may attend any discussion section you like. See the Syllabus for more information.

- Discussion 0: Vector Calculus, Linear Algebra (solution)
- Discussion 01: Derivatives Review, Least Squares (solution)
- Discussion 02: Ridge Regression (solution)
- Discussion 03: Bias-Variance Tradeoff (solution)
- Discussion 04: Kernel and Multivariate Gaussians (solution)
- Discussion 05: Dimensionality reduction (solution)
- Discussion 06: Midterm Review (solution)
- Discussion 07: Backpropagation (solution)
- Discussion 08: GD/SGD (solution)
- Discussion 09: QDA and Logistic Regression
- Discussion 11: Kernels/Nearest Neighbors
- Discussion 13: Convolutional Neural Networks
- Discussion 14: Clustering


## Homeworks

All homeworks are partially graded, and it is highly recommended that you do them. Your lowest homework score will be dropped, but this drop should be reserved for emergencies. The semester's self-grade form is available (see the form for instructions). See the Syllabus for more information.

- Homework 0: Review and Linear Regression (TeX) (solution)
- Homework 01: Least Squares (TeX) (data) (solution)
- Homework 02: Ridge Regression (TeX) (data) (solution)
- Homework 03: Probabilistic Models (TeX) (data) (solution) (solution code)
- Homework 04: Kernel methods (TeX) (data) (solution)
- Homework 05: Dimensionality reduction (TeX) (data) (solution)
- Homework 06: CCA and Midterm Redo (TeX) (data) (solution)
- Homework 07: Backpropagation (TeX) (data) (solution) (solution code)
- Homework 08: SGD and Classification (TeX) (data) (solution) (solution code)
- Homework 09: LDA, CCA (TeX) (data)
- Homework 10: Support Vector Machines
- Homework 11: Kernels and Neighbors
- Homework 12: Sparsity and Decision Trees
- Homework 13: (Convolutional) Neural Networks
- Homework 14: K-SVD
