Instructor:
Start Date:
Price: (an optional paid upgrade is available to include graded assignments and a completion certificate)
What you will learn
- Bayes’ Theorem. Differences between classical (frequentist) and Bayesian inference.
- Posterior inference: summarizing posterior distributions, credible intervals, posterior probabilities, posterior predictive distributions and data visualization.
- Gamma-Poisson, beta-binomial, and normal conjugate models for data analysis (a beta-binomial sketch follows this list).
- Bayesian regression analysis and analysis of variance (ANOVA).
- Use of simulations for posterior inference. Simple applications of Markov chain Monte Carlo (MCMC) methods and their implementation in R.
- Bayesian cluster analysis.
- Model diagnostics and comparison.
- Answering the actual research question rather than merely “applying methods to the data”.
- Using latent (unobserved) variables and dealing with missing data.
- Multivariate analysis within the context of mixed effects linear regression models. Structure, assumptions, diagnostics and interpretation. Posterior inference and model selection.
- Why Monte Carlo integration works and how to implement your own Metropolis-Hastings MCMC algorithm in R (a minimal sketch follows this list).
- Bayesian model averaging in the context of change-point problems. Pinpointing the time of change and obtaining uncertainty estimates for it.
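
To give a flavor of the conjugate models listed above, here is a minimal R sketch of beta-binomial updating. The prior parameters and data values are hypothetical illustrations, not course material.

```r
# Minimal sketch of beta-binomial conjugate updating (illustrative values only).
# Prior: theta ~ Beta(a, b); data: y successes in n Bernoulli trials.
a <- 1; b <- 1          # uniform Beta(1, 1) prior, assumed for illustration
y <- 7; n <- 20         # hypothetical data: 7 successes out of 20 trials

# Conjugacy gives the posterior in closed form: Beta(a + y, b + n - y)
a_post <- a + y
b_post <- b + n - y

# Posterior summaries: mean and a 95% equal-tailed credible interval
post_mean <- a_post / (a_post + b_post)
cred_int  <- qbeta(c(0.025, 0.975), a_post, b_post)

post_mean
cred_int
```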
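
And here is a minimal random-walk Metropolis-Hastings sketch in R, of the kind of implementation the course refers to. The standard normal target and the tuning values are assumptions made for illustration, not taken from the course.

```r
# Minimal random-walk Metropolis-Hastings sketch (hypothetical target, not course code).
# Target: an unnormalized standard normal density, worked on the log scale for stability.
log_target <- function(theta) -0.5 * theta^2

set.seed(1)
n_iter  <- 5000
prop_sd <- 1              # proposal standard deviation (tuning parameter, assumed)
draws   <- numeric(n_iter)
theta   <- 0              # starting value

for (i in seq_len(n_iter)) {
  proposal  <- rnorm(1, mean = theta, sd = prop_sd)      # symmetric proposal
  log_alpha <- log_target(proposal) - log_target(theta)  # log acceptance ratio
  if (log(runif(1)) < log_alpha) theta <- proposal       # accept, or keep current value
  draws[i] <- theta
}

mean(draws)    # Monte Carlo estimate of the target mean
hist(draws)    # quick look at the sampled distribution
```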