CS 189 at UC Berkeley

# Introduction to Machine Learning

Lectures: T/Th 3:30-5 p.m., 155 Dwinelle

### Week 0: Least Squares Framework

## Notes

See the Syllabus for more information, including a list of week-by-week topics. A comprehensive compilation of the notes is also available.

- Note 1: Least Squares (Draft)
- Note 2: Feature Engineering, Ridge Regression (Draft)
- Note 3: Hyperparameters, Cross-Validation (Draft)
- Note 4: Gaussians, MLE, MAP (Draft)
- Note 5: Bias-Variance Tradeoff (Draft)
- Note 6: Weighted Least Squares, Multivariate Gaussians (Draft)
- Note 7: MAP with Colored Noise (Draft)
- Note 8: Total Least Squares (Draft)
- Note 9: Principal Component Analysis (PCA) (Draft)
- Note 10: Canonical Correlation Analysis (CCA) (Draft)
- Note 11: Nonlinear Least Squares (Draft)
- Note 12: Neural Nets: Introduction (Draft)
- Note 13: Backpropagation (Draft)
- Note 14: QDA/LDA, More Multivariate Gaussians (Draft)
- Note 15: Discriminative Models, Logistic Regression (Draft)
- Note 16: Training Logistic Regression, Multiclass Logistic Regression (Draft)
- Note 17: Support Vector Machines (SVM) (Draft)
- Note 18: Duality and Dual SVMs (Draft)
- Note 19: Kernels (Draft)
- Note 20: Nearest Neighbor Classification (Draft)
- Note 21: Sparsity (Draft)
- Note 22: Decision Trees and Random Forests (Draft)
- Note 23: Boosting (Draft)
- Note 24: Convolutional Neural Networks (CNN) (Draft)
- Note 25: Dimensionality Reduction (Draft)


## Discussions

The discussion sections may cover new material and will give you additional practice solving problems. You may attend any discussion section you like. See the Syllabus for more information.

- Discussion 01: Review, Least Squares
- Discussion 02: Ridge Regression
- Discussion 03: Bias-Variance Tradeoff
- Discussion 04: Multivariate Gaussians
- Discussion 05: PCA, CCA, and Convexity
- Discussion 06: Gradient Descent
- Discussion 07: Backpropagation
- Discussion 09: LDA/QDA/SGD
- Discussion 10: SGD/SVM
- Discussion 11: Kernels/Nearest Neighbors
- Discussion 13: Convolutional Neural Networks
- Discussion 14: Clustering


## Homeworks

All homeworks are partially graded, and it is highly recommended that you do them. Your lowest homework score will be dropped, but this drop should be reserved for emergencies. The semester's self-grade form is available (see the form for instructions). See the Syllabus for more information.

- Homework 0: Course Logistics
- Homework 01: Review and Least Squares
- Homework 02: Ridge Regression
- Homework 03: Probabilistic Models
- Homework 04: Total Least Squares
- Homework 05: Canonical-Correlation Analysis
- Homework 06: Gradient Descent
- Homework 07: Backpropagation
- Homework 08: Midterm Redo
- Homework 09: Classification and SGD
- Homework 10: Support Vector Machines
- Homework 11: Kernels and Neighbors
- Homework 12: Sparsity and Decision Trees
- Homework 13: (Convolutional) Neural Networks
- Homework 14: K-SVD
