STATS 579 – Intermediate Bayesian Modeling

Intermediate Bayesian Modeling builds on the material from STATS 477 / 577 by providing a deeper exploration of Bayesian inference and Monte Carlo methods. This course includes proofs as well as more detailed mathematical treatments of key Bayesian results. The Metropolis-Hastings algorithm is introduced, and students are guided through programming their own sampling algorithms.

Course Information

Room: Science & Math Learning Center, Rm. 356

Time: Tuesday & Thursday, 2:00pm – 3:15pm

Prerequisites: STATS 477 / 577 – Introduction to Bayesian Modeling


Syllabus: fa19syllabus.pdf

Readings and Demonstrations

De Finetti on Exchangeability

The material here is derived from a series of lectures De Finetti gave at the Henri Poincaré Institute (IHP) in 1935. De Finetti's Theorem is expounded and proven in Chapter 3, with further discussion of exchangeability continuing in Chapters 4 and 5. Earlier, in Chapters 1 and 2, De Finetti gives his own development of the notion of probability, which makes for entertaining reading but is not of critical importance for this class. Chapter 6 provides a philosophical summation of De Finetti's perspective and how this work fits into that perspective.


Dr. Christensen's Model Selection Review (Dissertation)

Chapter 3 of my dissertation includes a review of many of the model selection tools we've discussed in class, including a proof of the asymptotic equivalence of DIC and AIC under certain conditions. If you're looking for a good textual review of what I've done in lecture, this should work well.
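As a toy sanity check of that equivalence (my own sketch, not taken from the dissertation), the snippet below computes DIC from posterior draws for an iid normal mean with known variance and compares it to AIC. The data and all numbers are made up for illustration.

```python
import math
import random

def deviance(theta, data, sigma=1.0):
    """D(theta) = -2 log p(y | theta) for iid N(theta, sigma^2) data."""
    ll = sum(-0.5 * math.log(2 * math.pi * sigma ** 2)
             - (y - theta) ** 2 / (2 * sigma ** 2) for y in data)
    return -2.0 * ll

rng = random.Random(0)
data = [1.2, 0.8, 1.5, 1.1, 0.9, 1.3]   # made-up data
n = len(data)
ybar = sum(data) / n

# Under a flat prior with sigma known, the posterior is conjugate:
# theta | y ~ N(ybar, sigma^2 / n), so we can draw from it directly.
draws = [rng.gauss(ybar, 1.0 / math.sqrt(n)) for _ in range(50000)]

dbar = sum(deviance(t, data) for t in draws) / len(draws)  # posterior mean deviance
d_hat = deviance(sum(draws) / len(draws), data)            # deviance at posterior mean
p_d = dbar - d_hat                  # effective number of parameters (should be near 1)
dic = dbar + p_d                    # equivalently d_hat + 2 * p_d

# AIC for the same model: the MLE is ybar, with k = 1 free parameter.
aic = deviance(ybar, data) + 2.0
```

In this simple conjugate setting p_d comes out near 1 and DIC lands very close to AIC, which is the flavor of the asymptotic result, though the general proof requires more care.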


Cavanaugh on AIC
Cavanaugh & Neath on BIC

UIowa's Joe Cavanaugh is an expert on model selection and information criteria. He is particularly good at distilling difficult mathematical arguments into easy-to-follow derivations. In the above papers, he lays out the detailed theoretical justifications for AIC and BIC in a way that should be understandable to well-prepared students of statistics.
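For quick reference, the criteria themselves are easy to compute once the maximized log-likelihood is in hand. A minimal Python sketch (the toy data here are made up, not from the papers):

```python
import math

def aic(log_lik, k):
    """AIC = -2 log L(theta_hat) + 2k, where k is the number of parameters."""
    return -2.0 * log_lik + 2.0 * k

def bic(log_lik, k, n):
    """BIC = -2 log L(theta_hat) + k log n, where n is the sample size."""
    return -2.0 * log_lik + k * math.log(n)

# Toy example: iid N(mu, sigma^2) fit by maximum likelihood (k = 2).
data = [2.1, 1.9, 2.5, 2.3, 1.8, 2.2, 2.0, 2.4]
n = len(data)
mu_hat = sum(data) / n
sigma2_hat = sum((x - mu_hat) ** 2 for x in data) / n  # MLE divides by n
log_lik = -0.5 * n * (math.log(2 * math.pi * sigma2_hat) + 1.0)
```

Note that BIC's per-parameter penalty (log n) exceeds AIC's (2) once n > e^2 ≈ 7.4, which is why BIC tends to select smaller models in all but the tiniest samples.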


DIC for Missing Data Models

Celeux et al. (2006) provide an in-depth discussion of how DIC can be operationalized for missing data models (including mixed models). Aside from its application to DIC, this is a good read purely for how it asks the reader to think harder about the deeper structure of missing data models.


Visual Demonstration of MC Methods

Chi Feng, a research assistant at MIT's computational design laboratory, created an interactive gallery for visualizing different Monte Carlo sampling algorithms. Many of the algorithms demonstrated here are ones we haven't talked about in class, but this can help us visualize the behavior of the Metropolis algorithm as well as Hamiltonian Monte Carlo.
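For students who want to go from the visualization to code, here is a minimal random-walk Metropolis sketch in Python. The standard-normal target and the tuning values are illustrative choices of mine, not taken from the demo:

```python
import math
import random

def metropolis(log_target, x0, step, n_samples, seed=0):
    """Random-walk Metropolis: propose x' ~ N(x, step^2) and accept with
    probability min(1, pi(x') / pi(x)); otherwise stay at x."""
    rng = random.Random(seed)
    x = x0
    samples = []
    for _ in range(n_samples):
        prop = x + rng.gauss(0.0, step)
        # Accept/reject on the log scale to avoid overflow.
        if math.log(rng.random()) < log_target(prop) - log_target(x):
            x = prop
        samples.append(x)
    return samples

# Illustrative target: standard normal, log density up to a constant.
draws = metropolis(lambda x: -0.5 * x * x, x0=0.0, step=2.4, n_samples=20000)
mean = sum(draws) / len(draws)
```

With enough draws, the sample mean and variance should settle near the target's 0 and 1, which is the same convergence behavior the gallery lets you watch unfold.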


Mustafa Salman's Version

One of my Bayes students, Mustafa Salman, forked Chi Feng's original code and added a rejection rate display in the top-left corner. This allows us to view the same demonstrations but also to better understand how modifying the tuning parameters affects the rejection rates. It's good for understanding optimality issues. (GitHub code available here)
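The tuning effect that display makes visible can also be checked numerically. Here is a small sketch of my own (not Salman's code) that tracks the acceptance rate of random-walk Metropolis on a standard-normal target as the proposal step size varies:

```python
import math
import random

def acceptance_rate(step, n_iter=20000, seed=1):
    """Fraction of accepted random-walk Metropolis proposals for a
    standard-normal target; the rejection rate is 1 minus this."""
    rng = random.Random(seed)
    x, accepted = 0.0, 0
    for _ in range(n_iter):
        prop = x + rng.gauss(0.0, step)
        # Log acceptance ratio for the target exp(-x^2 / 2).
        if math.log(rng.random()) < 0.5 * (x * x - prop * prop):
            x, accepted = prop, accepted + 1
    return accepted / n_iter

rates = {step: acceptance_rate(step) for step in (0.1, 2.4, 10.0)}
```

Tiny steps accept nearly everything but explore slowly, while huge steps are rejected most of the time; in one dimension, an acceptance rate somewhere in the middle (often quoted around 44% for this kind of target) tends to give the best mixing.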

Assignments

Homework

Homework 1 (tex) – Due 26 September (Solutions available)

Homework 2 (tex) – Due 17 October (Solutions available)

Homework 3 (tex) – Due 7 November (Solutions available)

Homework 4 (tex) – Not Assigned (Solutions available)

Final Project

Instructions (tex) – Due 13 December by 12pm (noon)

citations.csv – Data file for final project



Fletcher G.W. Christensen

Asst. Professor of Statistics

Office: SMLC 328

Fall 2019 Office Hours:
  • Tuesday 12:30pm – 1:30pm
  • Thursday 12:30pm – 1:30pm