TTIC 31230: Fundamentals of Deep Learning

David McAllester

Winter 2019

In 2019 there were no machine problems or class projects. The course was treated as an "algorithms class" rather than a "programming class".

Exam solutions appear at the end of this page. In addition, each lecture has associated problems.

Lecture Slides and Course Material:

  1. Information Theory: The Fundamental Equations of Deep Learning
  2. Back-Propagation and Frameworks
  3. Convolutional Neural Networks (CNNs)
  4. Trainability: Initialization, Batch Normalization, ResNet and Gated RNNs
  5. Language Modeling, Machine Translation and Attention
  6. Variants of Stochastic Gradient Descent (SGD)
  7. Generalization and Regularization
  8. Connectionist Temporal Classification (CTC)
  9. Deep Graphical Models
  10. More Information Theory: Avoiding Differential Entropy
  11. Rate-Distortion Autoencoders (RDAs)
  12. Expectation Maximization (EM), The Evidence Lower Bound (The ELBO) and Variational Autoencoders (VAEs)
  13. Generative Adversarial Networks (GANs)
  14. Pretraining
  15. Reinforcement Learning (RL)
  16. AlphaZero
  17. Gradients as Dual Vectors, Hessian-Vector Products, and Information Geometry
  18. The Black Box Problem
  19. The Quest for Artificial General Intelligence (AGI)

Exam solutions

  1. Quiz 1
  2. Quiz 2
  3. Quiz 3
  4. Quiz 4
  5. Final Exam