Theoretical Machine Learning

COS511 @ Princeton

Description

This is the page for a Princeton course taught by the fantastic Prof. Chi Jin, which I took in Fall 2023. Below are the course description and some resources on the topic.

“The course covers fundamental results in statistical learning theory: 1. Supervised learning: generalization, uniform concentration, empirical risk minimization, Rademacher complexity, VC theory, reproducing kernel Hilbert spaces, and several applications including neural networks, sparse linear regression, and low-rank matrix problems; 2. Online learning: sequential Rademacher complexity, Littlestone dimension, online algorithms and applications; 3. Unsupervised learning: latent variable models, maximum likelihood estimation, method of moments, tensor methods.”
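As a tiny taste of one concept from the list above (this is my own illustration, not course material): the empirical Rademacher complexity of a hypothesis class measures how well the class can correlate with random signs on a fixed sample, and for a finite class it can be estimated by Monte Carlo. The threshold-classifier class and all parameter choices below are assumptions made purely for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sample of n points on the real line (illustrative assumption).
n = 50
X = rng.uniform(-1, 1, size=n)

# Finite hypothesis class: threshold classifiers h_t(x) = sign(x - t).
thresholds = np.linspace(-1, 1, 21)
# Prediction matrix: one row per hypothesis, entries in {-1, +1}.
H = np.where(X[None, :] >= thresholds[:, None], 1.0, -1.0)

# Monte Carlo estimate of the empirical Rademacher complexity
#   R_S(H) = E_sigma [ sup_{h in H} (1/n) sum_i sigma_i h(x_i) ].
num_draws = 2000
sup_values = []
for _ in range(num_draws):
    sigma = rng.choice([-1.0, 1.0], size=n)  # i.i.d. Rademacher signs
    correlations = H @ sigma / n             # (1/n) sum_i sigma_i h(x_i), per h
    sup_values.append(correlations.max())    # sup over the finite class
rademacher_estimate = float(np.mean(sup_values))
print(f"estimated empirical Rademacher complexity: {rademacher_estimate:.3f}")
```

By Massart's finite-class lemma the true value is at most sqrt(2 ln|H| / n), so for 21 hypotheses on 50 points the estimate should come out well below that bound of roughly 0.35.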

Reading List

  • Shai Shalev-Shwartz and Shai Ben-David, Understanding Machine Learning: From Theory to Algorithms
  • Prof. Jin’s course notes at this website
  • Solid course notes from Cornell

Problem Sets

  • My problem sets! Most of them should be right, but some are certainly incorrect. Email me if there’s anything to discuss: