Introduction

The language of probability allows us to coherently and automatically account for uncertainty. This course will teach you how to build, fit, and perform inference in probabilistic models. These models let us generate novel images and text, find meaningful latent representations of data, take advantage of large unlabeled datasets, and even do analogical reasoning automatically. The course offers a broad view of model-building and optimization techniques based on probabilistic building blocks, which will serve as a foundation for more advanced machine learning courses.

More details can be found in the syllabus, on Quercus, and on Piazza (Access Code: r7anu46950q).

Announcements

  • Assignment 1 is out! It is due Sunday, February 1st, before 23:59.
  • The office hours for Tuesday, January 13th are exceptionally moved to 9:45 am – 11:45 am.
  • Lectures begin on Jan 6!

Instructor

  • Thibault Randrianarisoa, Office: UY 9087

Teaching Assistants

Yichen Ji, Shengzhuo Li, Liam Welsh, Yan Zhang, Amir Reza Peimani

  • They will handle all questions related to homework assignments, the midterm, and the final exam.
  • Email: TBA (indicate the scope in the email subject: HW1, HW2, general, etc.)

Time & Location

Tuesday, 6:00 PM - 9:00 PM

In Person: MC 102

Suggested Reading

There are no required textbooks. The suggested readings in the schedule below refer to:

  • PML1: K. Murphy, Probabilistic Machine Learning: An Introduction
  • PML2: K. Murphy, Probabilistic Machine Learning: Advanced Topics
  • PRML: C. Bishop, Pattern Recognition and Machine Learning

Lectures and (tentative) timeline

| Week | Lectures | Suggested reading | Tutorials | Quiz |
|------|----------|-------------------|-----------|------|
| Week 1 (5–11 January) | Introduction / Probabilistic Models | PML1 1.1–1.3; PML1 3.4, 4.2 | Tutorial 1 | Quiz 0 – Solutions |
| Week 2 (12–18 January) | Directed Graphical Models / Decision Theory (Annotated slides) | PRML 1.5; PML2 4.2 | | Quiz 1 – Solutions |
| Week 3 (19–25 January) | Exact Inference / Message Passing | PML2 9.3, 9.5 | Tutorial 2 | Quiz 2 – Solutions |
| Week 4 (26 January – 1 February) | Hidden Markov Models / Monte Carlo Methods | PML2 9.2.1, 29.2.1–29.2.4; PML2 11.1–11.5 | HMMs (Colab) | |
| Week 5 (2–8 February) | MCMC | PML2 2.6, 12.1–12.6 | | |
| Week 6 (9–15 February) | Variational Inference | PML2 5.1, 6.5.3, 10.1–10.3; David Blei’s review of VI | | |
| Week 7 (16–22 February) | Reading Week | | | |
| Week 8 (23 February – 1 March) | Midterm | | | |
| Week 9 (2–8 March) | Neural Networks | PML1 8.2, 13.1–13.3, 13.5.7; PML2 6.1–6.3; Andrej Karpathy’s recipe for training NNs | | |
| Week 10 (9–15 March) | Gaussian Processes | PML1 17.2; PML2 18.1–18.5, 18.7 | | |
| Week 11 (16–22 March) | Embeddings / Attention / Transformers | PML1 15.4–15.5; PML2 16.2.7, 16.3.5 | | |
| Week 12 (23–29 March) | Variational Autoencoders | PML2 16.3.3, 21 | | |
| Week 13 (30 March – 5 April) | Diffusion Models | PML2 25 | | |

Homeworks

| Homework # | Out | Due | TA Office Hours | Solutions |
|------------|-----|-----|-----------------|-----------|
| Assignment 1 | January 19th | February 1st, at 23:59 | TBD | |
| Assignment 2 | TBD | TBD | TBD | |
| Assignment 3 | TBD | TBD | TBD | |
| Assignment 4 | TBD | TBD | TBD | |

Project

Project guidelines for graduate students can be found here.

Computing Resources

For the homework assignments, we will primarily use Python and libraries such as NumPy, SciPy, and scikit-learn. You have two options:

  • The easiest option is to run everything on Google Colab.
  • Alternatively, you can install everything yourself on your own machine.
    • If you don’t already have Python, install it using Anaconda.
    • Use pip to install the required packages: `pip install scipy numpy autograd matplotlib jupyter scikit-learn` (note that the package is named scikit-learn on PyPI, not sklearn).
  • For those unfamiliar with NumPy, there are many good resources, e.g. the NumPy tutorial and the NumPy Quickstart.
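As a quick self-check that your environment works and that you are comfortable with NumPy's vectorized style, here is a minimal sketch (not part of any assignment) of a numerically stable log-sum-exp, a routine that comes up constantly in probabilistic modelling when normalizing log-probabilities:

```python
import numpy as np

def logsumexp(x):
    """Compute log(sum(exp(x))) without overflow by shifting by the max."""
    m = np.max(x)
    return m + np.log(np.sum(np.exp(x - m)))

# Normalize unnormalized log-probabilities into a distribution.
# A naive np.exp(logits) here would overflow to inf.
logits = np.array([1000.0, 1001.0, 1002.0])
log_probs = logits - logsumexp(logits)
probs = np.exp(log_probs)

assert np.isclose(probs.sum(), 1.0)  # a valid probability distribution
```

(SciPy also ships this as `scipy.special.logsumexp` if you prefer not to write your own.)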