David McAllester

Winter 2019

In 2019 there were no machine problems or class projects; the course was treated as an "algorithms class" rather than a "programming class".

Exam solutions can be found at the end of this page. In addition, each lecture has associated problems.

Lecture Slides and Course Material:

- Information Theory: The Fundamental Equations of Deep Learning
- Back-Propagation and Frameworks
- Convolutional Neural Networks (CNNs)
- Trainability: Initialization, Batch Normalization, ResNet and Gated RNNs
- Language Modeling, Machine Translation and Attention
- Variants of Stochastic Gradient Descent (SGD)
- Generalization and Regularization
- Connectionist Temporal Classification (CTC)
- Deep Graphical Models
- More Information Theory: Avoiding Differential Entropy
- Rate-Distortion Autoencoders (RDAs)
- Expectation Maximization (EM), The Evidence Lower Bound (The ELBO) and Variational Autoencoders (VAEs)
- Generative Adversarial Networks (GANs)
- Pretraining
- Reinforcement Learning (RL)
- AlphaZero
- Gradients as Dual Vectors, Hessian-Vector Products, and Information Geometry
- The Black Box Problem
- The Quest for Artificial General Intelligence (AGI)

Exam Solutions