MAS6004 Inference

Note: This is an old module occurrence.

You may wish to visit the module list for information on current teaching.

Both semesters, 2011/12 20 Credits
Lecturer: Prof Paul Blackwell

This unit is largely concerned with practical statistical inference. Modern computational tools for implementing frequentist and likelihood-based approaches to inference are explored, with strong emphasis on the use of simulation and Monte Carlo methods. Statistical theory is also developed, with an introduction to the Bayesian approach to inference and decision making. Computational methods for practical Bayesian inference are also covered.

There are no prerequisites for this module.
No other modules have this module as a prerequisite.

Outline syllabus

  • Semester 1: Bayesian Statistics
    • Subjective probability.
    • Inference using Bayes' theorem. Prior distributions. Exponential families. Conjugacy. Exchangeability.
    • Predictive inference.
    • Utility and decisions. Tests and interval estimation from a decision-theoretic perspective.
    • Model checking. Robustness. Sensitivity. Bayes factors for model checking.
    • Hierarchical models.
    • Computation. Gibbs sampling. Metropolis-Hastings. Graphical models. Case studies.
  • Semester 2: Computational inference
    • Kernel density estimation.
    • Computational methods for likelihoods. Profile likelihood.
    • Simulation. Generating techniques. Monte Carlo integration and variance reduction.
    • Bootstrapping.
    • Simulation and Monte Carlo testing. Randomization tests.


Aims

  • To extend understanding of the practice of statistical inference.
  • To familiarize the student with ideas, techniques and some uses of statistical simulation.
  • To describe computational implementation of likelihood-based analyses.
  • To introduce examples of modern computer-intensive statistical techniques.
  • To familiarize the student with the Bayesian approach to inference.
  • To describe computational implementation of Bayesian analyses.

Learning outcomes

On successful completion of the unit, a student will:
  • appreciate the versatility of simulation methods in statistical inference,
  • be able to use simulation methods in hypothesis/significance testing and interval estimation,
  • understand the idea behind the Expectation-Maximisation Algorithm and apply it to practical problems,
  • understand and be able to use the technique of kernel density estimation,
  • understand and apply Bayesian ideas of prior-posterior updating,
  • understand the concepts of utility and maximization of expected utility in decision making,
  • understand the idea of Gibbs sampling and apply it to practical problems in Bayesian inference.

Teaching methods

Lectures, with a complete set of printed notes, plus task and exercise sheets. Practical sessions using the packages R and WinBUGS.

34 lectures, no tutorials, 6 practicals


Assessment

Two projects (30%) and a three-hour restricted open-book examination (70%).

Full syllabus

Bayesian theory

  • The subjective interpretation of probability. Constructing subjective probabilities.
  • Independence and exchangeability. (1 session)
  • Inference using Bayes' theorem. Discrete examples. (1 session)
  • Prior distributions. Exponential families. Conjugacy. (1 session)
  • Continuous examples: normally distributed data with known variance, binomial data. (1 session)
  • Continuous examples: Poisson and normal distributions with unknown variance. (1 session)
  • Predictive inference. (1 session)
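The conjugate updating covered in these sessions can be sketched in a few lines. The course's practicals use R and WinBUGS; the Python fragment below is an illustrative sketch only, with invented data: a Beta(a, b) prior for a binomial success probability is conjugate, so the posterior after s successes in n trials is Beta(a + s, b + n - s).

```python
# Conjugate prior-posterior updating for binomial data (illustrative sketch).
# A Beta(a, b) prior is conjugate: the posterior is Beta(a + s, b + n - s).

def beta_binomial_update(a, b, successes, n):
    """Return posterior Beta parameters after observing binomial data."""
    return a + successes, b + (n - successes)

def beta_mean(a, b):
    """Mean of a Beta(a, b) distribution."""
    return a / (a + b)

# Uniform Beta(1, 1) prior; observe 7 successes in 10 trials (invented data).
a_post, b_post = beta_binomial_update(1, 1, 7, 10)
print(a_post, b_post)             # Beta(8, 4) posterior
print(beta_mean(a_post, b_post))  # posterior mean 8/12
```

The posterior mean, 8/12, also gives the predictive probability that the next trial is a success, which is the sense in which conjugacy makes predictive inference tractable.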
Decision theory and its role in inference
  • Utility and decisions. Maximising expected utility. (2 sessions)
  • Point estimation, interval estimation and hypothesis testing from a decision-theoretic perspective. (2 sessions)
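The principle behind these sessions, choosing the action that maximises expected utility under the posterior, can be shown with a toy example. The actions, states, and utilities below are invented for illustration, and Python stands in for the course's own tools.

```python
# Maximising expected utility over a discrete posterior (invented example).

posterior = {"theta_low": 0.3, "theta_high": 0.7}

# utility[action][state]: payoffs are hypothetical.
utility = {
    "treat":      {"theta_low": -1.0, "theta_high": 5.0},
    "do_nothing": {"theta_low":  0.0, "theta_high": 0.0},
}

def expected_utility(action):
    """Posterior-weighted average utility of an action."""
    return sum(posterior[s] * utility[action][s] for s in posterior)

best = max(utility, key=expected_utility)
print(best, expected_utility(best))  # 'treat' with expected utility 3.2
```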
Bayesian modelling
  • Hierarchical models (2 sessions)
  • Model checking. Robustness. Sensitivity. Bayes factors for model comparisons. (2 sessions)
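A Bayes factor compares two models via the ratio of their marginal likelihoods. As a hedged sketch (the hypotheses and data are invented, not from the course notes): for s successes in n binomial trials, compare H0: theta = 0.5 against H1: theta uniform on (0, 1), under which the marginal likelihood is 1/(n + 1).

```python
from math import comb

# Bayes factor B01 for binomial data (illustrative sketch).
# H0: theta = 0.5 exactly; H1: theta ~ Uniform(0, 1), i.e. a Beta(1, 1) prior,
# under which every count s in 0..n has marginal probability 1/(n + 1).

def bayes_factor_01(successes, n):
    p_data_h0 = comb(n, successes) * 0.5 ** n
    p_data_h1 = 1.0 / (n + 1)
    return p_data_h0 / p_data_h1

# 7 successes in 10 trials: B01 slightly above 1, weak evidence either way.
print(bayes_factor_01(7, 10))
```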
Bayesian computation with Markov Chain Monte Carlo (MCMC) methods
  • Gibbs sampling, graphical models (2 sessions)
  • MCMC using WinBUGS (1 session)
  • WinBUGS practicals: case studies. (2 sessions)
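The idea of Gibbs sampling, drawing each variable in turn from its full conditional, can be sketched outside WinBUGS. This Python fragment is an illustration only, for a bivariate normal target with correlation rho, where X | Y=y ~ N(rho*y, 1 - rho^2) and symmetrically for Y.

```python
import random

# Gibbs sampler sketch for a standard bivariate normal with correlation rho.
# Each coordinate is drawn from its full conditional in turn.

def gibbs_bivariate_normal(rho, n_iter, seed=0):
    rng = random.Random(seed)
    sd = (1 - rho ** 2) ** 0.5
    x, y = 0.0, 0.0
    samples = []
    for _ in range(n_iter):
        x = rng.gauss(rho * y, sd)  # X | Y = y ~ N(rho*y, 1 - rho^2)
        y = rng.gauss(rho * x, sd)  # Y | X = x ~ N(rho*x, 1 - rho^2)
        samples.append((x, y))
    return samples

draws = gibbs_bivariate_normal(rho=0.8, n_iter=20000)
xs = [x for x, _ in draws[1000:]]          # discard burn-in
mean_x = sum(xs) / len(xs)
var_x = sum((v - mean_x) ** 2 for v in xs) / len(xs)
print(round(mean_x, 2), round(var_x, 2))   # close to the true values 0 and 1
```

The chain's marginal for X should settle to a standard normal, which is what the mean and variance check above is probing.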
Introduction to Monte Carlo methods
  • Monte Carlo integration (1 session)
  • Variance reduction techniques: antithetic variables, control variables, conditioning (1 session)
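Monte Carlo integration and the antithetic-variables trick from these sessions can be sketched on a toy integral (Python here stands in for the course's R). Integrating exp(x) over [0, 1] (true value e - 1), each uniform draw u is paired with 1 - u, giving negatively correlated evaluations and a smaller variance than plain sampling.

```python
import math
import random

# Monte Carlo estimate of the integral of exp(x) on [0, 1]; true value e - 1.

def mc_plain(n, rng):
    """Plain Monte Carlo: average of exp at n independent uniforms."""
    return sum(math.exp(rng.random()) for _ in range(n)) / n

def mc_antithetic(n, rng):
    """Antithetic variables: pair each u with 1 - u before averaging."""
    total = 0.0
    for _ in range(n // 2):
        u = rng.random()
        total += 0.5 * (math.exp(u) + math.exp(1 - u))  # antithetic pair
    return total / (n // 2)

rng = random.Random(1)
true_value = math.e - 1
print(round(mc_plain(100000, rng), 3))       # near 1.718
print(round(mc_antithetic(100000, rng), 3))  # near 1.718, lower variance
```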
Simulation methods in statistical inference
  • Randomization tests, confidence intervals based on randomization (2 sessions)
  • Monte Carlo tests (2 sessions)
  • Bootstrap methods; significance tests and bootstrap confidence intervals (3 sessions)
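The percentile bootstrap interval covered here resamples the data with replacement, recomputes the statistic on each resample, and reads off empirical quantiles. A minimal Python sketch with invented data (the course's own examples use R):

```python
import random

# Percentile bootstrap confidence interval for a statistic (sketch).

def bootstrap_ci(data, stat, n_boot=5000, alpha=0.05, seed=0):
    """Resample with replacement, recompute stat, take empirical quantiles."""
    rng = random.Random(seed)
    n = len(data)
    reps = sorted(
        stat([rng.choice(data) for _ in range(n)]) for _ in range(n_boot)
    )
    lo = reps[int((alpha / 2) * n_boot)]
    hi = reps[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi

mean = lambda xs: sum(xs) / len(xs)
data = [4.1, 5.3, 2.8, 6.0, 4.7, 5.5, 3.9, 4.4, 5.1, 4.8]  # invented data
low, high = bootstrap_ci(data, mean)
print(round(low, 2), round(high, 2))  # an interval around the sample mean
```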
Random number generation
  • Congruential generators, the inversion method (1 session)
  • The rejection method, adaptive rejection (2 sessions)
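Both generating techniques named above are short to sketch (in Python here, for illustration). Inversion: if U is Uniform(0, 1) then F^{-1}(U) has distribution F, so an Exponential(rate) draw is -log(1 - U)/rate. Rejection: sample from a proposal g and accept with probability f(x)/(M g(x)); below, a Beta(2, 2) target via a uniform proposal with envelope constant M = 1.5.

```python
import math
import random

def exponential_inversion(rate, rng):
    """Inversion method: F^{-1}(u) = -log(1 - u) / rate for Exponential(rate)."""
    return -math.log(1.0 - rng.random()) / rate

def beta22_rejection(rng):
    """Rejection method for Beta(2, 2): f(x) = 6x(1-x) <= M = 1.5 on [0, 1]."""
    while True:
        x = rng.random()                        # proposal: Uniform(0, 1)
        if rng.random() <= 6 * x * (1 - x) / 1.5:
            return x                            # accept with prob f(x)/(M g(x))

rng = random.Random(0)
exp_draws = [exponential_inversion(2.0, rng) for _ in range(50000)]
print(round(sum(exp_draws) / len(exp_draws), 2))  # near the true mean 1/2
```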
Computational methods for likelihood-based inference
  • The E-M algorithm (4 sessions)
  • Profile likelihood (1 session)
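The E-M iteration can be illustrated on a two-component normal mixture with known unit variances, alternating an E-step (responsibilities) and an M-step (weighted estimates). The simulated data and starting values below are invented, and this is a Python sketch rather than the course's own implementation.

```python
import math
import random

def normal_pdf(x, mu):
    """Standard-deviation-1 normal density at x with mean mu."""
    return math.exp(-0.5 * (x - mu) ** 2) / math.sqrt(2 * math.pi)

def em_two_normals(data, mu1, mu2, w=0.5, n_iter=50):
    """E-M for a two-component unit-variance normal mixture."""
    for _ in range(n_iter):
        # E-step: responsibility of component 1 for each observation.
        r = [
            w * normal_pdf(x, mu1)
            / (w * normal_pdf(x, mu1) + (1 - w) * normal_pdf(x, mu2))
            for x in data
        ]
        # M-step: update mixing weight and component means.
        w = sum(r) / len(data)
        mu1 = sum(ri * x for ri, x in zip(r, data)) / sum(r)
        mu2 = sum((1 - ri) * x for ri, x in zip(r, data)) / sum(1 - ri for ri in r)
    return mu1, mu2, w

rng = random.Random(0)
data = [rng.gauss(0, 1) for _ in range(300)] + [rng.gauss(4, 1) for _ in range(300)]
mu1, mu2, w = em_two_normals(data, mu1=-1.0, mu2=5.0)
print(round(mu1, 1), round(mu2, 1), round(w, 2))  # near 0, 4 and 0.5
```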
Non-parametric density estimation
  • Kernel density estimation (3 sessions)
  • Further reading and case studies (inter-semester break)
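The kernel density estimate studied in these sessions is f_hat(x) = (1/nh) * sum_i K((x - x_i)/h). A minimal Python sketch with a Gaussian kernel, a hand-picked bandwidth, and invented data:

```python
import math

def gaussian_kernel(u):
    """Standard normal density, used as the smoothing kernel K."""
    return math.exp(-0.5 * u * u) / math.sqrt(2 * math.pi)

def kde(x, data, h):
    """Kernel density estimate f_hat(x) = (1/nh) * sum K((x - x_i)/h)."""
    return sum(gaussian_kernel((x - xi) / h) for xi in data) / (len(data) * h)

data = [1.2, 1.9, 2.1, 2.4, 3.0, 3.3, 5.8, 6.1, 6.4]  # invented sample
# Estimate is high near the cluster around 2 and low in the gap near 4.5.
print(round(kde(2.2, data, h=0.5), 3))
print(round(kde(4.5, data, h=0.5), 3))
```

The bandwidth h controls the bias-variance trade-off: smaller h tracks the data more closely but is rougher, which is the central practical issue in these sessions.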

Reading list

Type Author(s) Title
B Garthwaite, P.H., Jolliffe, I.T. & Jones, B. Statistical Inference
B Gelman, A., Carlin, J.B., Stern, H.S. & Rubin, D.B. Bayesian Data Analysis
B Kalbfleisch, J.G. Probability and Statistical Inference
B Lee, P. M. Bayesian Statistics: An Introduction
B Morgan, B. J. T. Elements of Simulation
B Tanner, M.A. Tools for Statistical Inference

(A = essential, B = recommended, C = background.)

Most books on reading lists should also be available from the Blackwells shop on Mappin Street.