This text reflects modern Bayesian statistical practice. Simulation is introduced in all of the probability chapters and is used extensively in the Bayesian material to simulate from posterior and predictive distributions. One chapter describes the basic tenets of the Metropolis and Gibbs sampling algorithms, while several chapters develop the fundamentals of Bayesian inference with conjugate priors to deepen understanding. Strategies for constructing prior distributions are described both for situations where one has substantial prior information and for cases where prior knowledge is weak. One chapter introduces hierarchical Bayesian modeling as a practical way of combining data from different groups. There is an extensive discussion of Bayesian regression models, including the construction of informative priors, inference about functions of the parameters of interest, prediction, and model selection.
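As a concrete taste of the conjugate-prior simulation approach described above, here is a minimal sketch (written in Python for self-containment, rather than the book's R/JAGS; the data values are invented for illustration) of drawing from a beta posterior for a binomial probability and from the corresponding posterior predictive distribution:

```python
import random

# Hypothetical data (invented for illustration): 12 successes in 20 trials
successes, trials = 12, 20

# With a Beta(1, 1) prior, conjugacy gives a
# Beta(1 + successes, 1 + failures) posterior
a_post = 1 + successes
b_post = 1 + (trials - successes)

random.seed(0)

# Simulate draws from the posterior of the binomial probability p
posterior_draws = [random.betavariate(a_post, b_post) for _ in range(10_000)]

# Posterior predictive: for each posterior draw of p,
# simulate a future number of successes out of 20 new trials
predictive_draws = [sum(random.random() < p for p in [p] * trials)
                    for p in posterior_draws]

post_mean = sum(posterior_draws) / len(posterior_draws)
print(post_mean)  # close to the exact posterior mean 13/22
```

The same two-step logic (draw a parameter from the posterior, then draw new data given that parameter) underlies the predictive simulations used throughout the Bayesian chapters.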
The text uses JAGS (Just Another Gibbs Sampler) as a general-purpose computational method for simulating from the posterior distributions of a variety of Bayesian models. An accompanying R package, ProbBayes, contains all of the book's datasets and special functions for illustrating concepts from the text.
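For readers who have not seen JAGS, models are specified declaratively in its own modeling language. A minimal (hypothetical) script for learning about a binomial proportion with a beta prior, in the spirit of the models the book fits, looks like:

```
model {
  y ~ dbin(p, n)    # likelihood: y successes in n trials
  p ~ dbeta(1, 1)   # uniform beta prior on the probability p
}
```

The data (y, n) are passed in from R, and JAGS returns simulated draws from the posterior of p.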
A complete solutions manual is available in the Additional Resources section for instructors who adopt the book.

Table of Contents
2. Counting Methods
3. Conditional Probability
4. Discrete Distributions
5. Continuous Distributions
6. Joint Probability Distributions
7. Learning about a Binomial Probability
8. Modeling Measurement and Count Data
9. Simulation by Markov Chain Monte Carlo
10. Bayesian Hierarchical Modeling
11. Simple Linear Regression
12. Bayesian Multiple Regression and Logistic Models
13. Case Studies