Note: This is an old module occurrence.
You may wish to visit the module list for information on current teaching.
|Both semesters, 2014/15||20 Credits|
|Lecturer:||Dr Miguel Juarez||uses MOLE|
This unit is largely concerned with practical statistical inference. Modern computational tools for the implementation of the frequentist and likelihood-based approaches to inference are explored, with strong emphasis placed on the use of simulation and Monte Carlo methods. Statistical theory is also developed with an introduction to the Bayesian approach to inference and decision making. Computational methods for practical Bayesian inference will also be covered.
There are no prerequisites for this module.
No other modules have this module as a prerequisite.
- Semester 1: Bayesian Statistics
- Subjective probability.
- Inference using Bayes Theorem. Prior distributions. Exponential families. Conjugacy. Exchangeability.
- Predictive inference.
- Utility and decisions. Tests and interval estimation from a decision-theoretic perspective.
- Model checking. Robustness. Sensitivity. Bayes factors for model comparison.
- Hierarchical models.
- Computation. Gibbs sampling. Metropolis-Hastings. Graphical models. Case studies.
- Semester 2: Computational Inference
- Kernel density estimation.
- Computational methods for likelihoods. Profile likelihood.
- Simulation. Random variate generation techniques. Monte Carlo integration and variance reduction.
- Simulation and Monte Carlo testing. Randomization tests.
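The conjugacy idea from the Semester 1 topics can be illustrated with a beta-binomial update. A minimal sketch (in Python rather than the R used in the practicals; the prior and data below are invented purely for illustration):

```python
def beta_binomial_update(a, b, successes, failures):
    """Conjugate update: a Beta(a, b) prior combined with binomial
    data gives a Beta(a + successes, b + failures) posterior."""
    return a + successes, b + failures

# Hypothetical data: 7 successes in 10 trials, with a flat Beta(1, 1) prior.
a_post, b_post = beta_binomial_update(1, 1, 7, 3)
posterior_mean = a_post / (a_post + b_post)
print(a_post, b_post, posterior_mean)  # Beta(8, 4) posterior; mean 2/3
```

Because the posterior stays in the beta family, updating reduces to adding the observed counts to the prior parameters.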
The aims of this unit are:
- To extend understanding of the practice of statistical inference.
- To familiarize the student with ideas, techniques and some uses of statistical simulation.
- To describe computational implementation of likelihood-based analyses.
- To introduce examples of modern computer-intensive statistical techniques.
- To familiarize the student with the Bayesian approach to inference.
- To describe computational implementation of Bayesian analyses.
At the end of the unit, students will:
- appreciate the versatility of simulation methods in statistical inference,
- be able to use simulation methods in hypothesis/significance testing and interval estimation,
- understand the idea behind the Expectation-Maximisation Algorithm and apply it to practical problems,
- understand and be able to use the technique of kernel density estimation,
- understand and apply Bayesian ideas of prior-posterior updating,
- understand the concepts of utility and maximization of expected utility in decision making,
- understand the idea of Gibbs sampling and apply it to practical problems in Bayesian inference.
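As a flavour of the kernel density estimation outcome above, a minimal sketch with a Gaussian kernel (in Python rather than the R used in the practicals; the sample and bandwidth are invented for illustration):

```python
from math import exp, pi, sqrt

def gaussian_kernel(u):
    """Standard normal density, used as the smoothing kernel."""
    return exp(-0.5 * u * u) / sqrt(2 * pi)

def kde(x, data, h):
    """Kernel density estimate at x: average of kernels centred at
    each observation, scaled by the bandwidth h."""
    return sum(gaussian_kernel((x - xi) / h) for xi in data) / (len(data) * h)

data = [1.2, 1.9, 2.1, 2.8, 3.0]  # hypothetical sample
print(kde(2.0, data, h=0.5))
```

The bandwidth h controls the bias-variance trade-off: small h gives a spiky estimate, large h oversmooths.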
Lectures, with a complete set of printed notes, plus task and exercise sheets. Practical sessions using R.
34 lectures, no tutorials, 6 practicals
Two projects (30%), and a three-hour examination (70%). NB: The exam this year is NOT restricted open book.
- The subjective interpretation of probability. Constructing subjective probabilities.
- Independence and exchangeability. (1 session)
- Inference using Bayes Theorem. Discrete examples. (1 session)
- Prior distributions. Exponential families. Conjugacy. (1 session)
- Continuous examples: normally distributed data with known variance, binomial data. (1 session)
- Continuous examples: Poisson and normal distributions with unknown variance. (1 session)
- Predictive inference. (1 session)
- Utility and decisions. Maximising expected utility. (2 sessions)
- Point estimation, interval estimation and hypothesis testing from a decision-theoretic perspective. (2 sessions)
- Hierarchical models (2 sessions)
- Model checking. Robustness. Sensitivity. Bayes factors for model comparisons. (2 sessions)
- Gibbs sampling, graphical models (2 sessions)
- MCMC using R (1 session)
- R practicals: case studies. (2 sessions)
- Monte Carlo integration (1 session)
- Variance reduction techniques: antithetic variates, control variates, conditioning (1 session)
- Randomization tests, confidence intervals based on randomization (2 sessions)
- Monte Carlo tests (2 sessions)
- Bootstrap methods; significance tests and bootstrap confidence intervals (3 sessions)
- Congruential generators, the inversion method (1 session)
- The rejection method, adaptive rejection (2 sessions)
- The EM algorithm (4 sessions)
- Profile likelihood (1 session)
- Kernel density estimation (3 sessions)
- Further reading and case studies (inter-semester break)
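The Monte Carlo integration and antithetic-variates sessions can be previewed with a short sketch (in Python rather than the R used in the practicals; the target integral is chosen for illustration):

```python
import random
from math import e, exp

random.seed(1)  # reproducible illustration

def mc_integral(f, n):
    """Plain Monte Carlo estimate of the integral of f over [0, 1]."""
    return sum(f(random.random()) for _ in range(n)) / n

def mc_antithetic(f, n):
    """Antithetic variates: pair each uniform draw U with 1 - U.
    The negative correlation between f(U) and f(1 - U) (for monotone f)
    reduces the variance of the estimate. n should be even."""
    total = 0.0
    for _ in range(n // 2):
        u = random.random()
        total += f(u) + f(1 - u)
    return total / n

# Target: integral of exp(x) on [0, 1], exactly e - 1
print(mc_integral(exp, 10_000), mc_antithetic(exp, 10_000))
```

Both estimators are unbiased; the antithetic version typically gets closer to e - 1 for the same number of function evaluations.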
|B||Box, G.E.P. and Tiao, G.C.||Bayesian Inference in Statistical Analysis||519.42 (B)|
|B||DeGroot, M.H. and Schervish, M.J.||Probability and Statistics|
|B||Gelman, A., Carlin, J.B., Stern, H.S. and Rubin, D.B.||Bayesian Data Analysis|
|B||Kalbfleisch, J.G.||Probability and Statistical Inference|
|B||Lee, P.M.||Bayesian Statistics: An Introduction|
|B||Morgan, B.J.T.||Elements of Simulation|
|B||Tanner, M.A.||Tools for Statistical Inference|
(A = essential, B = recommended, C = background.)
Most books on reading lists should also be available from the Blackwells shop at Jessop West.
Timetable (semester 2)
|Wed||09:00 - 09:50||lecture||Hicks Lecture Theatre 6|
|Fri||09:00 - 09:50||lecture||Hicks Lecture Theatre C|