Introduction
This course introduces the core concepts of Bayesian statistics, the theoretical properties of Bayesian estimators, and their use in decision-making under uncertainty. You will learn key Bayesian modelling approaches and advanced computational techniques tailored to the needs and structure of different models.
We begin with parametric models to introduce the core ideas of Bayesian analysis, covering prior specification, inference (point estimation, credible sets, and hypothesis testing), its decision-theoretic foundations, and key asymptotic results such as posterior consistency and the Bernstein–von Mises theorem. We will also study modern computational methods for Bayesian inference and introduce more advanced topics, including high-dimensional and nonparametric models. For these models, we will focus primarily on prior construction and on results about convergence rates and adaptation.
More details can be found in the syllabus, on Quercus, and on Piazza (access code: n0px27jcbmb).
Announcements
- Lectures begin on Jan 8!
Instructor
- Thibault Randrianarisoa, Office: IA 4064
- Email: t.randrianarisoa@utoronto.ca (put “STA414” in the subject)
- Office hours: Thursday 10am–1pm
Teaching Assistants
TBA
- They will handle all questions related to homework assignments, the midterm, and the final exam.
- Email: TBA (indicate the scope in the subject line: HW1, HW2, general, etc.)
Time & Location
Thursday, 3:00–6:00 PM
In Person: IA 1160
Suggested Reading
The course will be based on some of the content of the book Bayesian Data Analysis (BDA) by Gelman, Carlin, Stern, Dunson, Vehtari & Rubin. It is freely available online on the book's home page, https://sites.stat.columbia.edu/gelman/book/, which also contains additional material (lecture notes, code demos, …).
Additional suggested readings are:
- (TBC) Christian P. Robert (2007), The Bayesian Choice
- (MCSM) Christian P. Robert and George Casella (2004), Monte Carlo Statistical Methods
- (BC) Jean-Michel Marin and Christian P. Robert (2007), Bayesian Core: A Practical Approach to Computational Bayesian Statistics
Lectures and (tentative) timeline
| Week | Lectures | Suggested reading | Problems | Timeline |
|---|---|---|---|---|
| Week 1 (5–11 January) | Introduction and reminders of Statistics and Probability | | PS1.pdf | |
| Week 2 (12–18 January) | Choice of priors, Aspects of the posterior | | | |
| Week 3 (19–25 January) | Decision theory | | | |
| Week 4 (26 January–1 February) | Bayesian tests, Model selection | | | |
| Week 5 (2–8 February) | Sampling Algorithms | | | |
| Week 6 (9–15 February) | Variational Bayes | | | |
| Week 7 (16–22 February) | Reading Week | | | |
| Week 8 (23 February–1 March) | Midterm | | | |
| Week 9 (2–8 March) | Asymptotic properties in parametric Bayesian models | | | |
| Week 10 (9–15 March) | Priors for high-dimensional models | | | |
| Week 11 (16–22 March) | Dirichlet process | | | |
| Week 12 (23–29 March) | Gaussian processes | | | |
| Week 13 (30 March–5 April) | Asymptotics in Bayesian nonparametrics | | | |
Homeworks
| Homework # | Out | Due | TA Office Hours | Solutions |
|---|---|---|---|---|
| Assignment 1 | TBD | TBD | TBD | |
| Assignment 2 | TBD | TBD | TBD |
Computing Resources
For the homework assignments, we will primarily use Python and libraries such as NumPy, SciPy, and scikit-learn. You have two options:
- The easiest option is to run everything on Google Colab.
- Alternatively, you can install everything yourself on your own machine.
- If you don’t already have Python, install it using Anaconda.
- Use pip to install the required packages (note that the PyPI package for scikit-learn is `scikit-learn`, not `sklearn`):
`pip install scipy numpy autograd matplotlib jupyter scikit-learn`
- For those unfamiliar with NumPy, there are many good resources, e.g. the NumPy tutorial and the NumPy Quickstart.
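Once your environment is set up (either option above), you can sanity-check it with a short script. The example below is a minimal sketch, not part of any assignment: it computes a conjugate Beta–Binomial posterior with NumPy and SciPy, with the true parameter, prior, and sample size chosen arbitrarily for illustration.

```python
# Sanity check for the course environment: a conjugate Beta-Binomial update.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulate 50 coin flips with (illustrative) true success probability 0.7.
theta_true = 0.7
x = rng.binomial(n=1, p=theta_true, size=50)

# Beta(a, b) prior; with a Binomial likelihood the posterior is
# Beta(a + #successes, b + #failures).
a, b = 1.0, 1.0
posterior = stats.beta(a + x.sum(), b + len(x) - x.sum())

print("posterior mean:", posterior.mean())
print("95% credible interval:", posterior.interval(0.95))
```

If the script runs and prints a posterior mean near 0.7 with a credible interval around it, your NumPy/SciPy installation is working.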