MAS464 Bayesian Statistics
Semester 1, 2017/18 | 10 Credits
Lecturer: Dr Miguel Juarez
This unit develops the Bayesian approach to statistical inference. The Bayesian method is fundamentally different in philosophy from conventional frequentist/classical inference, and has been the subject of some controversy in the past. It is, however, becoming increasingly popular in many fields of applied statistics. This course will cover both the foundations of Bayesian statistics, including subjective probability, utility and decision theory, and modern computational tools for practical inference problems, specifically Markov chain Monte Carlo methods and Gibbs sampling. Applied Bayesian methods will be demonstrated in a series of case studies using the software package R.
Prerequisites: MAS223 (Statistical Inference and Modelling)
Not with: MAS364 (Bayesian Statistics)
No other modules have this module as a prerequisite.
- Subjective probability.
- Inference using Bayes' Theorem. Prior distributions. Exponential families. Conjugacy. Exchangeability.
- Predictive inference.
- Utility and decisions. Tests and interval estimation from a decision-theoretic perspective.
- Hierarchical models.
- Computation. Gibbs sampling. Metropolis-Hastings. Case studies.
- Linear regression.
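As a standard worked illustration of the conjugacy item above (an example, not part of the official syllabus text): a Beta prior combined with a binomial likelihood yields a Beta posterior, with the prior parameters updated by the counts of successes and failures:

```latex
\theta \sim \mathrm{Beta}(a,b), \qquad
x \mid \theta \sim \mathrm{Binomial}(n,\theta)
\quad\Longrightarrow\quad
\theta \mid x \sim \mathrm{Beta}(a + x,\; b + n - x).
```

Informally, the prior behaves like a + b pseudo-observations, a of them successes, which the observed data then augment.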
- To extend understanding of the practice of statistical inference.
- To familiarize the student with the Bayesian approach to inference.
- To describe computational implementation of Bayesian analyses.
- Carry out Bayesian analysis for a range of standard statistical problems.
- Apply the Bayesian approach to straightforward novel situations.
- Use Bayesian computational software, e.g. R, for realistically complex problems and interpret the results in context.
Lectures, problem solving
20 lectures, no tutorials, 4 practicals
One formal 2-hour written examination [70%]. Format: 3 questions from 4. Continuous assessment [10%]: two assignments, each worth 5%. Project [20%].
- The subjective interpretation of probability. Constructing subjective probabilities.
- Independence and exchangeability.
- Inference using Bayes' Theorem. Discrete examples.
- Prior distributions. Exponential families. Conjugacy.
- Continuous examples: normally distributed data with known variance, binomial data.
- Continuous examples: the Poisson distribution; the normal distribution with unknown variance.
- Predictive inference.
- Utility and decisions. Maximising expected utility.
- Point estimation, interval estimation and hypothesis testing from a decision-theoretic perspective.
- Hierarchical models.
- Model checking. Robustness. Sensitivity. Bayes factors for model comparisons.
- Gibbs sampling; graphical models.
- MCMC using R.
- R practicals: case studies.
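The computational items above centre on MCMC. The course practicals use R; purely as an illustrative sketch (not course material), a random-walk Metropolis sampler for a standard normal target can be written in a few lines. The names here (`metropolis`, `log_target`, `step`) are invented for the example.

```python
import math
import random

def metropolis(log_target, x0, n_samples, step=1.0, seed=0):
    """Random-walk Metropolis sampler (illustrative sketch).

    log_target -- log of the (possibly unnormalised) target density
    x0         -- starting point of the chain
    step       -- standard deviation of the Gaussian proposal
    """
    rng = random.Random(seed)
    x = x0
    samples = []
    for _ in range(n_samples):
        proposal = x + rng.gauss(0.0, step)
        # Accept with probability min(1, target(proposal) / target(x)),
        # computed on the log scale for numerical stability.
        accept_prob = math.exp(min(0.0, log_target(proposal) - log_target(x)))
        if rng.random() < accept_prob:
            x = proposal
        samples.append(x)
    return samples

# Target: standard normal, whose log density is -x^2/2 up to a constant.
draws = metropolis(lambda x: -0.5 * x * x, x0=0.0, n_samples=20000)
mean = sum(draws) / len(draws)
var = sum((d - mean) ** 2 for d in draws) / len(draws)
```

The sample mean and variance of `draws` should be close to 0 and 1 respectively; in practice one would discard an initial burn-in portion of the chain and check convergence before using the draws.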
B | Gelman, Carlin, Stern and Rubin | Bayesian Data Analysis | 519.42 (W)
B | Lee | Bayesian Statistics: An Introduction | 519.542 (L)
(A = essential, B = recommended, C = background.)
Most books on reading lists should also be available from the Blackwells shop at Jessop West.
Thu | 12:00 - 12:50 | lecture | Hicks Lecture Theatre 2
Fri | 14:00 - 14:50 | lecture | Hicks Lecture Theatre 2