Bayesian Statistics and Hierarchical Bayesian Modeling for Psychological Science
Teaching materials for the award-winning* BayesCog seminar at the Faculty of Psychology, University of Vienna, held as part of the Advanced Seminar for master's students (Mind and Brain track; recorded in the 2020 Summer Semester).
Instructor: Dr. Lei Zhang
Location: [virtually via Zoom]
When: 09:45-11:15 Wednesdays (see calendar below)
Recording: available on YouTube (also see below).
See also a Twitter thread (liked 600+ times) summarizing the course.
* This course received a commendation award from the Society for the Improvement of Psychological Science (SIPS).
- Computational and mathematical modeling provide an insightful quantitative framework that allows researchers to inspect latent processes and to understand hidden mechanisms. As a result, computational modeling has gained increasing attention in many areas of cognitive science and neuroscience (hence the term cognitive modeling). One illustration of this trend is the growing popularity of Bayesian approaches to cognitive modeling. This course therefore teaches the theoretical and practical knowledge necessary to perform, evaluate, and interpret Bayesian modeling analyses.
- This course introduces students to the fundamentals of Bayesian statistics as well as to basic techniques of Bayesian cognitive modeling. We will use R/RStudio and the probabilistic programming language Stan to perform Bayesian analyses, ranging from simple binomial and linear regression models to more complex hierarchical models (a minimal sketch follows below).
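For a first flavor of the workflow, here is a minimal sketch (illustrative only, not taken from the course materials; the data values and sampler settings are assumptions) of fitting the globe-toss binomial model with the rstan package:

```r
# Minimal illustrative sketch: globe-toss binomial model fitted with rstan.
# Assumes rstan is installed; data values (6 'water' in 9 tosses) are made up.
library(rstan)

model_code <- "
data {
  int<lower=0> N;               // number of globe tosses
  int<lower=0, upper=N> k;      // number of 'water' outcomes
}
parameters {
  real<lower=0, upper=1> theta; // probability of water
}
model {
  theta ~ beta(1, 1);           // flat prior
  k ~ binomial(N, theta);       // likelihood
}
"

stan_data <- list(N = 9, k = 6)
fit <- stan(model_code = model_code, data = stan_data,
            chains = 4, iter = 2000)
print(fit, pars = "theta")      # posterior summary for theta
```

The course builds up to such models step by step, starting from R basics and grid approximation before moving to Stan.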
L01: 18.03 Introduction and overview <slides> <video>
L02: 27.03 Introduction to R/RStudio I <slides> <video>
L03: 27.03 Introduction to R/RStudio II <slides> <video>
L04: 22.04 Probability and Bayes' theorem <slides> <video>
L05: 29.04 Linking data and parameter/model <slides> <video>
L06: 06.05 Grid approximation of Binomial model & intro to MCMC <slides> <video> (see the sketch after this schedule)
L07: 13.05 Intro to Stan I and Binomial model in Stan <slides> <video>
L08: 20.05 Intro to Stan II and Regression models in Stan <slides> <video>
L09: 27.05 Intro to cognitive modeling & Rescorla-Wagner model <slides> <video>
L10: 03.06 Implementing Rescorla-Wagner in Stan <slides> <video>
L11: 10.06 Hierarchical modeling + Stan optimization <slides> <video>
L12: 17.06 Model comparison <slides> <video>
L13: 24.06 Stan tips & debugging in Stan <slides> <video>
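As a quick preview of L06, below is a minimal grid-approximation sketch for the binomial globe-toss posterior (illustrative only; the grid size and data values are assumptions, not taken from the course code):

```r
# Minimal illustrative sketch: grid approximation of the binomial posterior
# (cf. L06). Data values (6 'water' in 9 tosses) and grid size are made up.
theta_grid <- seq(0, 1, length.out = 1000)            # grid over theta
prior      <- rep(1, length(theta_grid))              # flat prior
likelihood <- dbinom(6, size = 9, prob = theta_grid)  # 6 water in 9 tosses
posterior  <- likelihood * prior
posterior  <- posterior / sum(posterior)              # normalize

plot(theta_grid, posterior, type = "l",
     xlab = "theta (probability of water)",
     ylab = "posterior probability")
```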
Folder | Task | Model |
---|---|---|
00.cheatsheet | NA | NA |
01.R_basics | NA | NA |
02.binomial_globe | Globe toss | Binomial model |
03.bernoulli_coin | Coin flip | Bernoulli model |
04.regression_height | Observed weight and height | Linear regression model |
05.regression_height_poly | Observed weight and height | Linear regression model |
06.reinforcement_learning | 2-armed bandit task | Simple reinforcement learning (RL) |
07.optm_rl | 2-armed bandit task | Simple reinforcement learning (RL) |
08.compare_models | Probabilistic reversal learning task | Simple and fictitious RL models |
09.debugging | Memory retention | Exponential decay model |
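To illustrate the reinforcement-learning entries above (folders 06-08), here is a minimal sketch of a Rescorla-Wagner learner on a 2-armed bandit; the parameter values and simulation setup are assumptions for illustration, not the course's own code:

```r
# Minimal illustrative sketch: Rescorla-Wagner learning on a 2-armed bandit.
# Parameter values and reward probabilities are assumed for illustration.
set.seed(1)
n_trials <- 100
alpha    <- 0.3                  # learning rate (assumed)
tau      <- 2.0                  # softmax inverse temperature (assumed)
p_reward <- c(0.8, 0.2)          # true reward probabilities of the two arms

V      <- c(0, 0)                # action values
choice <- reward <- numeric(n_trials)

for (t in 1:n_trials) {
  p_choose  <- exp(tau * V) / sum(exp(tau * V))  # softmax choice rule
  choice[t] <- sample(1:2, 1, prob = p_choose)
  reward[t] <- rbinom(1, 1, p_reward[choice[t]])
  pe <- reward[t] - V[choice[t]]                 # prediction error
  V[choice[t]] <- V[choice[t]] + alpha * pe      # Rescorla-Wagner update
}

table(choice)  # the learner should come to prefer the better arm (arm 1)
```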
- The distribution zoo: an interactive tool to build intuitions about common probability distributions.
- Probability distribution explorer: another interactive tool for exploring probability distributions, with code in Python and Stan.
- Michael Betancourt's blog posts: comprehensive case studies using Stan.
- The Stan Forums: a community to discuss Stan and Bayesian modeling.
[Journal articles]
- Kruschke, J. K., & Liddell, T. M. (2018). Bayesian data analysis for newcomers. Psychonomic Bulletin & Review, 25(1), 155-177.
- Wagenmakers, E. J., Marsman, M., Jamil, T., Ly, A., Verhagen, J., Love, J., ... & Matzke, D. (2018). Bayesian inference for psychology. Part I: Theoretical advantages and practical ramifications. Psychonomic Bulletin & Review, 25(1), 35-57.
- Daw, N. D. (2011). Trial-by-trial data analysis using computational models. Decision making, affect, and learning: Attention and performance XXIII, 23, 3-38.
- Etz, A., Gronau, Q. F., Dablander, F., Edelsbrunner, P. A., & Baribault, B. (2018). How to become a Bayesian in eight easy steps: An annotated reading list. Psychonomic Bulletin & Review, 25(1), 219-234.
- Ahn, W. Y., Haines, N., & Zhang, L. (2017). Revealing neurocomputational mechanisms of reinforcement learning and decision-making with the hBayesDM package. Computational Psychiatry, 1, 24-57.
[Books]
- McElreath, R. (2020). Statistical Rethinking: A Bayesian Course with Examples in R and Stan, 2nd Ed. CRC Press.
- Lambert, B. (2018). A Student’s Guide to Bayesian Statistics. Sage.
For bug reports, please contact Lei Zhang ([email protected], or @lei_zhang_lz).
Thanks to Markdown Cheatsheet and shields.io.
This material is licensed under CC BY-NC 4.0: you may re-use and adapt it for non-commercial purposes, as long as you give appropriate credit, note any changes you made, and provide a link to the original source. Read here for more details.