
Bayesian Statistics

A PhD-level course at EMAp (https://emap.fgv.br/en).

To compile the slides, run

pdflatex -interaction=nonstopmode --shell-escape bayes_stats

two or three times, so that cross-references and the table of contents resolve.
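
If latexmk is available, it can handle the reruns automatically (an alternative, not the repository's documented workflow):

latexmk -pdf -shell-escape bayes_stats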

Pre-requisites

Books

Resources

  • An overview of computing techniques for Bayesian inference can be found here.
  • See Esteves, Stern and Izbicki's course notes.
  • Rafael Stern's excellent course.
  • Principles of Uncertainty by the inimitable J. Kadane is a book about avoiding being a sure loser. See this review by Christian Robert.
  • Bayesian Learning by Mattias Villani is a book for a computation-minded audience.
  • Michael Betancourt's website is a treasure trove of rigorous, modern and insightful applied Bayesian statistics. See this as a gateway drug.
  • Awesome Bayes is a curated list of Bayesian resources, including blog posts and podcasts.

Acknowledgements

Guido Moreira suggested topics, exercises and exam questions.

Syllabus

Lecture 0: Overview

Lecture 1: Principled Inference, decision-theoretical foundations

  • Berger and Wolpert's 1988 monograph is the definitive text on the Likelihood Principle (LP), stated informally after this list.
  • See this paper by Franklin and Bambirra for a generalised version of the LP.
  • As advanced reading, one can consider Birnbaum (1962) and a helpful review paper published 30 years later by Bjornstad.
  • Michael Evans has a few papers on the LP. See Evans, Fraser & Monette (1986) for an argument using a stronger version of CP, and Evans (2013) for a flaw in the original 1962 paper by Birnbaum.
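
For orientation, here is a rough paraphrase of the LP in symbols (our wording, not a quotation from any of the papers above):

% Likelihood Principle, informally: outcomes with proportional
% likelihood functions carry the same evidence about theta.
L_{E_1}(\theta \mid x_1) \propto L_{E_2}(\theta \mid x_2) \text{ for all } \theta
\implies \mathrm{Ev}(E_1, x_1) = \mathrm{Ev}(E_2, x_2)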

Lecture 2: Belief functions, coherence, exchangeability

Lecture 3: Priors I: rationale and construction; conjugate analysis

Lecture 4: Priors II: types of priors; implementation

Required reading

Optional reading

Lecture 5: Bayesian point estimation

  • The paper The Federalist Papers As a Case Study by Mosteller and Wallace (1984) is a very nice example of Bayesian inference applied to disputed authorship. It is cited in Sharon McGrayne's book "The Theory That Would Not Die" as a triumph of Bayesian inference. It is also a serious contender for coolest paper abstract ever.
  • This post on Andrew Gelman's blog discusses how to deal with the sample size (n) in a Bayesian problem: either write out a full model that specifies a probability distribution for n, or condition on n through an approximate prior pi(theta | n); both options are sketched after this list.
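
In symbols, the two options look roughly like this (notation ours, not the post's):

% Option 1: a full model that also specifies a distribution for n
p(\theta, n \mid y) \propto \pi(\theta) \, p(n \mid \theta) \, p(y \mid n, \theta)

% Option 2: condition on n through an approximate prior
p(\theta \mid y, n) \propto \pi(\theta \mid n) \, p(y \mid n, \theta)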

Lecture 6: Bayesian Testing I

Required reading

Optional reading

Lecture 7: Bayesian Testing II

  • This paper by Lavine and Schervish provides a nice "disambiguation" of what Bayes factors can and cannot do inferentially; a toy computation follows this list.
  • Yao et al. (2018), along with the ensuing discussion, is a must-read for understanding modern prediction-based Bayesian model comparison.
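
To fix ideas, here is a minimal sketch of what a Bayes factor computes, for a point null against a Beta(1, 1) alternative in a binomial model (toy numbers, taken from neither paper):

# Bayes factor for H0: theta = 0.5 vs H1: theta ~ Beta(1, 1), y ~ Binomial(n, theta)
y <- 14; n <- 20                 # toy data
marg0 <- dbinom(y, n, 0.5)       # marginal likelihood under the point null
marg1 <- integrate(function(t) dbinom(y, n, t) * dbeta(t, 1, 1), 0, 1)$value
BF01 <- marg0 / marg1            # evidence in favour of H0 over H1
BF01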

Lecture 8: Asymptotics

  • The Encyclopedia of Mathematics entry on the Bernstein-von Mises theorem is nicely written.
  • The integrated nested Laplace approximation (INLA) methodology leverages Laplace approximations to provide accurate approximations to the posterior in latent Gaussian models, which cover a huge class of models used in applied work. This paper by Thiago G. Martins and others, especially Section 2, is a good introduction. A toy Laplace approximation is sketched after this list.
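
As a toy illustration of the Laplace idea (and of the Gaussian limit behind Bernstein-von Mises), here is a quadratic approximation to a binomial log-posterior at its mode, with invented data and a flat prior on the logit scale:

# Laplace approximation: a Gaussian centred at the posterior mode,
# with variance from the inverse negative Hessian there.
y <- 35; n <- 50                   # invented binomial data
neg_log_post <- function(t) -dbinom(y, n, plogis(t), log = TRUE)  # t = logit(theta)
fit <- optim(0, neg_log_post, method = "BFGS", hessian = TRUE)
m <- fit$par                       # posterior mode on the logit scale
s <- sqrt(1 / fit$hessian[1, 1])   # Laplace standard deviation
plogis(m + c(-1.96, 0, 1.96) * s)  # approximate 95% interval and mode for theta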

Lecture 9: Applications I

  • Ever wondered what to do when both the number of trials and the success probability are unknown in a binomial model? Well, this paper by Adrian Raftery has an answer; a minimal grid version of the problem is sketched after this list. See also this discussion with JAGS and Stan implementations.
  • This case study shows how to create a model from first (physical) principles.
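
A rough grid sketch of the unknown-N problem, integrating p out under a Beta(1, 1) prior and putting a 1/N prior on N; the counts are invented and this is not Raftery's exact method:

# Grid posterior for N when x_i ~ Binomial(N, p), p ~ Beta(1, 1)
x <- c(16, 18, 22, 25, 27)  # invented success counts
k <- length(x)
N_grid <- max(x):500        # candidate values for N
# Marginal likelihood with p integrated out:
# prod(choose(N, x_i)) * Beta(sum(x) + 1, k*N - sum(x) + 1)
log_marg <- sapply(N_grid, function(N)
  sum(lchoose(N, x)) + lbeta(sum(x) + 1, k * N - sum(x) + 1))
log_post <- log_marg - log(N_grid)  # add the log of the 1/N prior
post <- exp(log_post - max(log_post))
post <- post / sum(post)
N_grid[which.max(post)]             # posterior mode for N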

Lecture 10: Applications II

Lecture 11: Discussion: Bayes vs. Frequentism

Disclaimer: everything in this section needs to be read with care so one does not become a zealot!
