Introduction
The language of probability allows us to account for uncertainty coherently and automatically. This course will teach you how to build, fit, and perform inference in probabilistic models. These models let us generate novel images and text, find meaningful latent representations of data, take advantage of large unlabeled datasets, and even perform analogical reasoning automatically. The course offers a broad view of model-building and optimization techniques based on probabilistic building blocks, which will serve as a foundation for more advanced machine learning courses.
More details can be found in the syllabus and on Quercus and Piazza (Access Code: r7anu46950q).
Announcements
- Lectures begin on Jan 6!
Instructor
- Thibault Randrianarisoa, Office: UY 9179
- Email: t.randrianarisoa@utoronto.ca (put “STA414” in the subject)
- Office hours: Tuesday 9:30-11:30
Teaching Assistants
Yichen Ji, Shengzhuo Li, Liam Welsh, Yan Zhang, Amir Reza Peimani
- They will handle all questions related to homework assignments, the midterm, and the final exam.
- Email: TBA (indicate the scope in the subject line: HW1, HW2, general, etc.)
Time & Location
Tuesday, 6:00 PM - 9:00 PM
In Person: MC 102
Suggested Reading
No required textbooks. Some suggested readings are:
- (PRML) Christopher M. Bishop (2006), Pattern Recognition and Machine Learning
- (DL) Ian Goodfellow, Yoshua Bengio, and Aaron Courville (2016), Deep Learning
- (MLPP) Kevin P. Murphy (2012), Machine Learning: A Probabilistic Perspective
- (ESL) Trevor Hastie, Robert Tibshirani, and Jerome Friedman (2009), The Elements of Statistical Learning
- (ISL) Gareth James, Daniela Witten, Trevor Hastie, and Robert Tibshirani (2023), An Introduction to Statistical Learning
- (ITIL) David MacKay (2003), Information Theory, Inference, and Learning Algorithms
- (PML1) Kevin P. Murphy (2022), Probabilistic Machine Learning: An Introduction
- (PML2) Kevin P. Murphy (2023), Probabilistic Machine Learning: Advanced Topics
Lectures and (tentative) timeline
| Week | Lectures | Suggested reading | Tutorials | Video | Timeline |
|---|---|---|---|---|---|
| Week 1 (5–11 January) | Introduction / Probabilistic Models | PML1 1.1–1.3 | Quiz 0 - Solutions | | |
| Week 2 (12–18 January) | Directed Graphical Models / Decision Theory | | | | |
| Week 3 (19–25 January) | Exact Inference / Message Passing | | | | |
| Week 4 (26 January–1 February) | Hidden Markov Models / Monte Carlo Methods | | | | |
| Week 5 (2–8 February) | MCMC | | | | |
| Week 6 (9–15 February) | Variational Inference | | | | |
| Week 7 (16–22 February) | Reading Week | | | | |
| Week 8 (23 February–1 March) | Midterm | | | | |
| Week 9 (2–8 March) | Neural Networks | | | | |
| Week 10 (9–15 March) | Gaussian Processes | | | | |
| Week 11 (16–22 March) | Embeddings / Attention / Transformers | | | | |
| Week 12 (23–29 March) | Variational Autoencoders | | | | |
| Week 13 (30 March–5 April) | Diffusion Models | | | | |
Homeworks
| Homework # | Out | Due | TA Office Hours | Solutions |
|---|---|---|---|---|
| Assignment 1 | TBD | TBD | TBD | |
| Assignment 2 | TBD | TBD | TBD | |
| Assignment 3 | TBD | TBD | TBD | |
| Assignment 4 | TBD | TBD | TBD | |
Computing Resources
For the homework assignments, we will primarily use Python and libraries such as NumPy, SciPy, and scikit-learn. You have two options:
- The easiest option is to run everything on Google Colab.
- Alternatively, you can install everything yourself on your own machine.
  - If you don't already have Python, install it using Anaconda.
  - Use pip to install the required packages:
    `pip install scipy numpy autograd matplotlib jupyter scikit-learn`
- For those unfamiliar with NumPy, there are many good resources, e.g. the NumPy tutorial and NumPy Quickstart. A quick sanity check for your setup is sketched below.
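Whichever option you choose, a minimal script along these lines (a sketch, not part of any assignment) should run without errors once the packages above are installed; the exact versions printed will depend on your setup:

```python
# Sanity check: confirm the packages used in the homework import correctly.
import numpy as np
import scipy
import sklearn
import matplotlib
import autograd  # automatic differentiation on top of NumPy

print("NumPy:", np.__version__)
print("SciPy:", scipy.__version__)
print("scikit-learn:", sklearn.__version__)
print("Matplotlib:", matplotlib.__version__)

# A tiny NumPy computation: samples from a standard normal should have
# sample mean close to 0 and sample variance close to 1.
rng = np.random.default_rng(seed=0)
x = rng.normal(size=10_000)
print("sample mean:", x.mean())      # approximately 0
print("sample variance:", x.var())   # approximately 1
```

If the imports fail on Colab, the same pip command can be run in a notebook cell (prefixed with `!`).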