Introduction

This course introduces the core concepts of Bayesian statistics, the theoretical properties of Bayesian estimators and their use in decision-making under uncertainty. You will learn key Bayesian modelling approaches and advanced computational techniques tailored to the needs and structure of different models.

We begin with parametric models to introduce the core ideas of Bayesian analysis, covering prior specification, inference (point estimation, credible sets, and hypothesis testing), its decision-theoretic foundations, and key asymptotic results such as posterior consistency and the Bernstein–von Mises theorem. We will also study modern computational methods for Bayesian inference and introduce more advanced topics, including high-dimensional and nonparametric models. For these models, we will focus primarily on prior construction and on results about convergence rates and adaptation.
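To make the first of these ideas concrete, here is a minimal sketch of Bayesian updating with a conjugate pair (an illustration, not part of the course materials): a Beta(2, 2) prior on a coin's success probability, updated with 7 heads out of 10 tosses, yielding a posterior point estimate and a credible interval.

```python
# Beta-Binomial conjugacy (illustrative sketch): the Beta prior is conjugate
# to the Binomial likelihood, so the posterior is again a Beta distribution.
from scipy import stats

a_prior, b_prior = 2, 2        # Beta prior hyperparameters
heads, tosses = 7, 10          # observed Binomial data

# Posterior: Beta(a + heads, b + tails)
a_post = a_prior + heads
b_post = b_prior + (tosses - heads)
posterior = stats.beta(a_post, b_post)

print("posterior mean:", posterior.mean())            # 9/14, a point estimate
print("95% credible interval:", posterior.interval(0.95))  # a credible set
```

The equal-tailed interval above is one of the credible sets discussed in the inference part of the course.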

More details can be found in the syllabus, on Quercus, and on Piazza (Access Code: n0px27jcbmb).

Announcements

  • The solutions for the midterm are posted here!
  • Solutions for the practice midterm are out!
  • Practice midterm is out! Solutions will follow soon.
  • Lectures begin on Jan 8!

Instructor

  • Thibault Randrianarisoa, Office: IA 4064

Teaching Assistants

Hanlong Chen (sl.chen@mail.utoronto.ca)

  • He will handle all questions related to the practice final exam.

Time & Location

Thursday, 3:00 PM - 6:00 PM

In Person: IA 1160

Suggested Reading

The course will be based on some of the content of the book Bayesian Data Analysis (BDA) by Gelman, Carlin, Stern, Dunson, Vehtari & Rubin. It is freely available online on the book's home page, https://sites.stat.columbia.edu/gelman/book/, which also contains additional material (lecture notes, code demos, …).

Additional suggested readings are:

Lectures and (tentative) timeline

| Week | Lectures | Suggested reading | Problems | Timeline |
|---|---|---|---|---|
| Week 1 (5–11 January) | Introduction and reminders of Statistics and Probability (Annotated slides) | Sec. 1.2, 1.3, 1.8 (+ Appendix A for reminders); Likelihood and sufficiency principle | Sheet 1; Solutions | |
| Week 2 (12–18 January) | Choice of priors, Aspects of the posterior (Annotated slides) | Sec. 1.5, 2.1, 2.4, 2.5, 2.8 | Sheet 2; Solutions | Quiz 1 |
| Week 3 (19–25 January) | Decision theory (Annotated slides; Erratum) | Sec. 9.1 | Sheet 3; Solutions | Quiz 2 |
| Week 4 (26 January–1 February) | Bayesian tests, Model selection (Annotated slides) | | Sheet 4; Solutions | Quiz 3 |
| Week 5 (2–8 February) | Sampling Algorithms (Annotated slides) | | | Quiz 4 |
| Week 6 (9–15 February) | Variational Bayes (Annotated slides) | | Sheet 5; Solutions | Quiz 5 |
| Week 7 (16–22 February) | Reading Week | | | |
| Week 8 (23 February–1 March) | Midterm; Recording (MCMC) | | | |
| Week 9 (2–8 March) | Asymptotic properties in parametric Bayesian models | | Sheet 6; Solutions | |
| Week 10 (9–15 March) | Priors for high-dimensional models | | | |
| Week 11 (16–22 March) | Dirichlet process | | | |
| Week 12 (23–29 March) | Gaussian processes | | | |
| Week 13 (30 March–5 April) | Asymptotics in Bayesian nonparametrics | | | |
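The sampling-algorithms material (Week 5 and the MCMC recording) centers on Markov chain Monte Carlo. As a minimal sketch of the idea (an illustration, not the course's own implementation), here is a random-walk Metropolis sampler targeting a standard normal density:

```python
# Random-walk Metropolis (illustrative sketch): propose a Gaussian step and
# accept it with probability min(1, target(proposal) / target(current)).
import numpy as np

def metropolis(log_target, n_samples, x0=0.0, step=1.0, seed=0):
    rng = np.random.default_rng(seed)
    x = x0
    samples = np.empty(n_samples)
    for i in range(n_samples):
        proposal = x + step * rng.standard_normal()
        # Work on the log scale to avoid underflow in the density ratio.
        if np.log(rng.uniform()) < log_target(proposal) - log_target(x):
            x = proposal
        samples[i] = x
    return samples

log_target = lambda x: -0.5 * x**2   # unnormalized log N(0, 1) density
draws = metropolis(log_target, 20_000)
print("sample mean (should be near 0):", draws.mean())
print("sample variance (should be near 1):", draws.var())
```

Only the ratio of target densities enters the acceptance step, which is why MCMC works with unnormalized posteriors.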

Homeworks

| Homework # | Out | Due | Solutions |
|---|---|---|---|
| Assignment 1 | March 2nd | March 15th, 23:59 | |
| Assignment 2 | TBD | TBD | |

Computing Resources

For the homework assignments, we will primarily use Python with libraries such as NumPy, SciPy, and scikit-learn. You have two options:

  • The easiest option is to run everything on Google Colab.
  • Alternatively, you can install everything yourself on your own machine.
    • If you don’t already have Python, install it using Anaconda.
    • Use pip to install the required packages: pip install scipy numpy autograd matplotlib jupyter scikit-learn
  • For those unfamiliar with NumPy, there are many good resources, e.g. the NumPy tutorial and NumPy Quickstart.
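After installing, you can sanity-check your setup with a short script like the following (a convenience sketch, not part of the assignments), which reports the version of each installed package and flags any that are missing:

```python
# Environment check: try importing each package used in the homeworks
# and print its version, or a warning if it is not installed.
import importlib

for name in ["numpy", "scipy", "matplotlib", "sklearn"]:
    try:
        mod = importlib.import_module(name)
        print(f"{name:12s} {getattr(mod, '__version__', 'unknown')}")
    except ImportError:
        print(f"{name:12s} MISSING - install it with pip")
```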