Deep Learning Theory

ORF543 @ Princeton

Description

This is the page for a course at Princeton taught by the one-of-a-kind Prof. Boris Hanin, which I took during Fall 2022. Below are the course description and some resources on the topic.

“This course is an introduction to deep learning theory. Using tools from mathematics (e.g. probability, functional analysis, spectral asymptotics and combinatorics) as well as physics (e.g. effective field theory, the 1/n expansion, and the renormalization group) we cover topics in approximation theory, optimization, and generalization.”

It was a complete surprise to me that I even ended up taking this course, and it turned out to be the best thing ever. Prof. Hanin is incredibly knowledgeable and delightfully energetic at 9:30am, and the material was fantastic. Sadly, there are no combined notes for this course, so I think my notes might be the best ones out there :)

Reading List

  • Roberts, Yaida, Hanin: The Principles of Deep Learning Theory. We didn’t really follow this book at all, but we applied its perturbation/power-expansion perspective numerous times, and it is independently awesome (see the sketch below).
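
To give a rough sense of what that perspective looks like (this is my own illustrative sketch, not an excerpt from the book or the course): observables of a width-n network get expanded as a power series in 1/n, with the leading term being the infinite-width value and the higher-order terms capturing finite-width corrections. Schematically, in LaTeX (the symbols O^{(k)} are placeholders, not quantities defined anywhere above):

    % Illustrative only: a generic large-width (1/n) expansion of a network
    % observable O_n, where n is the layer width.
    \[
      \mathbb{E}\left[ O_n \right]
        = O^{(0)} + \frac{O^{(1)}}{n} + \frac{O^{(2)}}{n^2} + O\!\left(n^{-3}\right),
    \]
    % Here O^{(0)} is the infinite-width (Gaussian-process) value and the
    % 1/n terms are the finite-width corrections computed perturbatively.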

Notes & Problem Sets

  • My notes from lecture.
  • My problem sets! I don’t think any of them are badly wrong, but some certainly contain mistakes. Email me if there’s anything to discuss:
  • Literature review for the midterm: here
  • The final project of my good friend David Shustin and me: BEWARE, THE RESULTS ARE WRONG! We assumed something was independent when it wasn’t, so the real problem is more difficult: link