CS 189 at UC Berkeley

# Introduction to Machine Learning

Lectures: Tu/Th 12:30-2 p.m., 155 Dwinelle

## Instructor Stella Yu

stellayu (at) berkeley.edu

Office Hours: Tu/Th 2-3 p.m., 400 Cory (see calendar)

## Professor Anant Sahai

sahai (at) eecs.berkeley.edu

Office Hours: Tu/Th 2-3 p.m., 400 Cory (see calendar)

## Week 1: Least Squares Framework

## Week 2: Features, Regularization, Hyperparameters, and Cross-Validation

## Week 3: MLE, MAP, OLS, Bias-Variance Tradeoffs

## Week 4: Weighted LS, Total LS, Eigenmethods

## Week 5: CCA, Feature Discovery

## Week 6: Nonlinear LS, Gradient Descent

## Week 7: Neural Nets, Stochastic Gradient Descent

- Note 12: Neural Nets: Introduction (Draft)
- Note 13: Backpropagation (Draft)
- Discussion 07 (solution)
- Homework 05 (TeX) (data) (solution) (self-grade)
- Homework 06 (TeX) (data) (solution) (self-grade)

## Week 8: Regression for Classification: Generative vs. Discriminative

## Week 9: Loss Functions, Hinge Loss, SVM

## Week 10: Kernel Methods, Nearest Neighbor Techniques

- Midterm 1
- Note 19: Kernel Trick (Draft)
- Discussion 10 (solution)
- Discussion 11 (solution)
- Homework 08 (TeX) (solution) (self-grade)
- Homework 09 (TeX) (solution) (self-grade)

## Week 11: Decision Trees, Boosting, Ensemble Methods

- Note 20: Nearest Neighbor Classification (Draft)
- Note 21: Sparsity, LASSO (Draft)
- Discussion 11 (solution)
- Homework 09 (TeX) (solution) (self-grade)
- Homework 10 (TeX) (solution) (self-grade)

## Week 12: Convolutional Neural Nets, Regularization Revisited

- Note 22: Coordinate Descent (Draft)
- Note 23: Decision Trees and Random Forests (Draft)
- Homework 10 (TeX) (solution) (self-grade)
- Homework 11 (TeX) (solution) (self-grade)

## Notes

See the Syllabus for more information, including a week-by-week list of topics.

- Note 1: Least Squares
- Note 2: Feature Engineering, Ridge Regression
- Note 3: Hyperparameters, Cross-Validation
- Note 4: Gaussians, MLE, MAP
- Note 5: Bias-Variance Tradeoff
- Note 6: Weighted Least Squares, Multivariate Gaussians
- Note 7: MAP with Colored Noise
- Note 8: Total Least Squares
- Note 9: Principal Component Analysis (PCA)
- Note 10: Canonical Correlation Analysis (CCA)
- Note 11: Nonlinear Least Squares
- Note 12: Neural Nets: Introduction (Draft)
- Note 13: Backpropagation (Draft)
- Note 14: QDA/LDA, More Multivariate Gaussians
- Note 15: Discriminative Models, Logistic Regression
- Note 16: Training Logistic Regression, Multiclass Logistic Regression (Draft)
- Note 17: Support Vector Machines (SVM)
- Note 19: Kernel Trick (Draft)
- Note 20: Nearest Neighbor Classification (Draft)
- Note 21: Sparsity, LASSO (Draft)
- Note 22: Coordinate Descent (Draft)
- Note 23: Decision Trees and Random Forests (Draft)
- Note 24: AdaBoost (Draft)
- Note 25: Convolutional Neural Networks (Draft)

## Discussions

The discussion sections may cover new material and will give you additional practice solving problems. You may attend any discussion section you like; however, if there are fewer desks than students, officially enrolled students get seating priority. See the Syllabus for more information.

- Discussion 01: Review, Least Squares (solution)
- Discussion 02: Ridge Regression
- Discussion 03: Bias-Variance Tradeoff (solution)
- Discussion 04: Multivariate Gaussians (solution)
- Discussion 05: PCA, CCA, and Convexity (solution)
- Discussion 06: Gradient Descent (solution)
- Discussion 07: Backpropagation (solution)
- Discussion 09: LDA/QDA/SGD (solution)
- Discussion 10: SGD/SVM (solution)
- Discussion 11: Kernels/Nearest Neighbors (solution)

## Homeworks

All homeworks are graded, and completing them is highly recommended. Your lowest homework score will be dropped, but this drop should be reserved for emergencies. See the Syllabus for more information.

- Homework 00: Course Logistics (solution) (self-grade)
- Homework 01: Review and Least Squares (TeX) (data) (solution) (self-grade)
- Homework 02: Ridge Regression (TeX) (data) (solution) (self-grade)
- Homework 03: Probabilistic Models (TeX) (data) (solution) (self-grade)
- Homework 04: Total Least Squares (TeX) (data) (solution) (self-grade)
- Homework 05: Canonical Correlation Analysis (TeX) (data) (solution) (self-grade)
- Homework 06: Gradient Descent (TeX) (data) (solution) (self-grade)
- Homework 07: Backpropagation (TeX) (solution) (self-grade) (solution code)
- Homework 08: Midterm Redo (TeX) (solution) (self-grade)
- Homework 09: Classification and SGD (TeX) (solution) (self-grade)
- Homework 10: Support Vector Machines (TeX) (solution) (self-grade)
- Homework 11: Kernels and Neighbors (TeX) (solution) (self-grade)