MAS6004 Inference

Both semesters, 2017/18 20 Credits
Lecturer: Dr Miguel Juarez. This module uses MOLE.

This unit is largely concerned with practical statistical inference. Modern computational tools for implementing frequentist and likelihood-based approaches to inference are explored, with strong emphasis on the use of simulation and Monte Carlo methods. Statistical theory is also developed, with an introduction to the Bayesian approach to inference and decision making. Computational methods for practical Bayesian inference are also covered.

There are no prerequisites for this module.
No other modules have this module as a prerequisite.


Outline syllabus

  • Semester 1: Bayesian Statistics
    • Subjective probability.
    • Inference using Bayes' Theorem. Prior distributions. Exponential families. Conjugacy. Exchangeability.
    • Predictive inference.
    • Utility and decisions. Tests and interval estimation from a decision-theoretic perspective.
    • Hierarchical models.
    • Computation. Gibbs sampling. Metropolis-Hastings. Case studies.
    • Linear regression.
  • Semester 2: Computational inference
    • Kernel density estimation.
    • Computational methods for likelihoods. Profile likelihood.
    • Simulation. Generating techniques. Monte Carlo integration and variance reduction.
    • Bootstrapping.
    • Simulation and Monte Carlo testing. Randomization tests.

Office hours

Wednesday 13:30-15:30



Aims

  • To extend understanding of the practice of statistical inference.
  • To familiarize the student with ideas, techniques and some uses of statistical simulation.
  • To describe computational implementation of likelihood-based analyses.
  • To introduce examples of modern computer-intensive statistical techniques.
  • To familiarize the student with the Bayesian approach to inference.
  • To describe computational implementation of Bayesian analyses.

Learning outcomes

By the end of the unit, students will:

  • appreciate the versatility of simulation methods in statistical inference,
  • be able to use simulation methods in hypothesis/significance testing and interval estimation,
  • understand the idea behind the Expectation-Maximisation (EM) algorithm and apply it to practical problems,
  • understand and be able to use the technique of kernel density estimation,
  • understand and apply Bayesian ideas of prior-posterior updating,
  • understand the concepts of utility and maximization of expected utility in decision making,
  • understand the idea of Gibbs sampling and apply it to practical problems in Bayesian inference.

Teaching methods

Lectures, with a complete set of printed notes, plus task and exercise sheets. Practical sessions using R.


34 lectures, no tutorials, 6 practicals

Assessment

One project (S1, 15%), three assessed exercises (S2, 5% each), and a three-hour examination (70%). NB: The exam is NOT restricted open book.

Full syllabus

Bayesian theory

  • The subjective interpretation of probability. Constructing subjective probabilities.
  • Independence and exchangeability.
  • Inference using Bayes' Theorem. Discrete examples.
  • Prior distributions. Exponential families. Conjugacy.
  • Continuous examples: normally distributed data with known variance, binomial data.
  • Continuous examples: Poisson and normal distributions with unknown variance.
  • Predictive inference.
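
To give a flavour of the conjugate prior-to-posterior updating and predictive inference listed above, here is a minimal R sketch for binomial data with a beta prior. The data (7 successes in 20 trials), the Beta(2, 2) prior and the number of simulation draws are invented purely for illustration; only base R functions are used.

  # Invented data: y successes in n Bernoulli trials
  y <- 7; n <- 20

  # Assumed Beta(a, b) prior for the success probability theta
  a <- 2; b <- 2

  # Conjugacy: beta prior + binomial likelihood gives a Beta(a + y, b + n - y) posterior
  a_post <- a + y
  b_post <- b + n - y

  # Posterior mean and a 95% equal-tailed credible interval for theta
  a_post / (a_post + b_post)
  qbeta(c(0.025, 0.975), a_post, b_post)

  # Posterior predictive distribution of the number of successes in 10 future trials,
  # approximated by simulation: draw theta from the posterior, then future data given theta
  theta <- rbeta(10000, a_post, b_post)
  y_new <- rbinom(10000, size = 10, prob = theta)
  table(y_new) / 10000
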
Decision theory and its role in inference
  • Utility and decisions. Maximising expected utility.
  • Point estimation, interval estimation and hypothesis testing from a decision-theoretic perspective.
Bayesian modelling
  • Hierarchical models
  • Model checking. Robustness. Sensitivity.
Bayesian computation with Markov Chain Monte Carlo (MCMC) methods
  • Gibbs sampling.
  • MCMC using R (a minimal Gibbs-sampling sketch follows this list)
  • R practicals: case studies.
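
As a rough indication of the style of those practicals, the sketch below runs a simple Gibbs sampler in base R for normally distributed data with unknown mean and precision. The simulated data, the prior hyperparameters (m0, s0, a0, b0), the number of iterations and the burn-in length are all assumptions made up for this illustration, not values taken from the course.

  set.seed(1)
  # Simulated data, purely for illustration
  y <- rnorm(50, mean = 3, sd = 2)
  n <- length(y)

  # Assumed priors: mu ~ N(m0, s0^2), tau ~ Gamma(a0, b0)
  m0 <- 0; s0 <- 10; a0 <- 0.01; b0 <- 0.01

  n_iter <- 5000
  mu  <- numeric(n_iter)
  tau <- numeric(n_iter)
  mu[1] <- mean(y); tau[1] <- 1 / var(y)

  for (t in 2:n_iter) {
    # Full conditional of the precision tau given mu: a gamma distribution
    tau[t] <- rgamma(1, a0 + n / 2, b0 + 0.5 * sum((y - mu[t - 1])^2))
    # Full conditional of the mean mu given tau: a normal distribution
    prec   <- 1 / s0^2 + n * tau[t]
    m_post <- (m0 / s0^2 + tau[t] * sum(y)) / prec
    mu[t]  <- rnorm(1, m_post, sqrt(1 / prec))
  }

  # Discard a burn-in period and summarise the remaining draws
  keep <- 1001:n_iter
  mean(mu[keep]); quantile(mu[keep], c(0.025, 0.975))
  mean(1 / sqrt(tau[keep]))   # posterior mean of the standard deviation
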
(inter-semester break)
Introduction to Monte Carlo methods
  • Monte Carlo integration (1 session)
  • Variance reduction techniques: antithetic variables, control variates, conditioning (1 session)
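
As an illustration of these two sessions (the target integral, sample size and random seed below are arbitrary choices, not course material), a base-R sketch compares crude Monte Carlo integration with the antithetic-variables estimator:

  set.seed(1)
  # Target: the integral of exp(x) over [0, 1]; its true value is exp(1) - 1
  g <- function(x) exp(x)
  n <- 10000

  # Crude Monte Carlo estimate from n independent uniform draws
  u <- runif(n)
  crude <- mean(g(u))

  # Antithetic variables: pair each u with 1 - u; the two evaluations are
  # negatively correlated, so their average has smaller variance
  u2 <- runif(n / 2)
  anti <- mean((g(u2) + g(1 - u2)) / 2)

  c(true = exp(1) - 1, crude = crude, antithetic = anti)

  # Estimated variances of the two estimators (same total number of evaluations)
  c(crude = var(g(u)) / n,
    antithetic = var((g(u2) + g(1 - u2)) / 2) / (n / 2))
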
Simulation methods in statistical inference
  • Randomization tests, confidence intervals based on randomization (2 sessions)
  • Monte Carlo tests (2 sessions)
  • Bootstrap methods; significance tests and bootstrap confidence intervals (3 sessions)
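
A hedged sketch of the kind of computation these sessions involve, combining a randomization test for a difference in means with a percentile bootstrap confidence interval; the two samples, the number of permutations and the number of bootstrap replicates are simulated or chosen arbitrarily as stand-ins for a real data set.

  set.seed(1)
  # Two illustrative samples (placeholders for real data)
  x <- rnorm(20, mean = 0)
  z <- rnorm(25, mean = 0.5)
  obs <- mean(z) - mean(x)

  # Randomization test: repeatedly shuffle the group labels and recompute the statistic
  pooled <- c(x, z)
  perm <- replicate(4999, {
    idx <- sample(length(pooled), length(z))
    mean(pooled[idx]) - mean(pooled[-idx])
  })
  p_value <- (sum(abs(perm) >= abs(obs)) + 1) / (length(perm) + 1)

  # Percentile bootstrap 95% confidence interval for the difference in means
  boot <- replicate(5000, mean(sample(z, replace = TRUE)) - mean(sample(x, replace = TRUE)))

  p_value
  quantile(boot, c(0.025, 0.975))
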
Random number generation
  • Congruential generators, the inversion method (1 session)
  • The rejection method, adaptive rejection (2 sessions)
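
For illustration only, the sketch below generates exponential variates by inversion and Beta(2, 2) variates by rejection from a uniform envelope; the rate parameter, the envelope constant M = 1.5 (the maximum of the target density) and the sample sizes are assumptions chosen for the example, and adaptive rejection is not shown.

  set.seed(1)

  # Inversion method: if U ~ Uniform(0, 1), then -log(U) / lambda ~ Exponential(lambda)
  lambda <- 2
  x_inv <- -log(runif(10000)) / lambda
  c(sample_mean = mean(x_inv), theoretical_mean = 1 / lambda)

  # Rejection method for the Beta(2, 2) density f(x) = 6x(1 - x) on (0, 1),
  # using a Uniform(0, 1) proposal and envelope constant M = 1.5
  f <- function(x) 6 * x * (1 - x)
  M <- 1.5
  y <- runif(10000)                    # proposals
  accept <- runif(10000) <= f(y) / M   # accept with probability f(y) / M
  x_rej <- y[accept]
  c(acceptance_rate = mean(accept), sample_mean = mean(x_rej))   # mean near 0.5
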
Computational methods for likelihood-based inference
  • The EM algorithm (4 sessions; an R sketch of EM for a normal mixture follows this list)
  • Profile likelihood (1 session)
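
A compact base-R sketch of the EM algorithm for a two-component normal mixture, a standard missing-data example (profile likelihood is not illustrated here). The simulated data, starting values and fixed number of iterations are arbitrary choices for illustration; a real analysis would monitor the log-likelihood for convergence rather than run a fixed loop.

  set.seed(1)
  # Simulated data from a two-component normal mixture (assumed, for illustration)
  x <- c(rnorm(150, mean = 0, sd = 1), rnorm(100, mean = 4, sd = 1))

  # EM for the mixture p * N(mu1, s1^2) + (1 - p) * N(mu2, s2^2)
  p <- 0.5; mu1 <- min(x); mu2 <- max(x); s1 <- s2 <- sd(x)

  for (iter in 1:200) {
    # E-step: posterior probability that each observation belongs to component 1
    d1 <- p * dnorm(x, mu1, s1)
    d2 <- (1 - p) * dnorm(x, mu2, s2)
    w  <- d1 / (d1 + d2)

    # M-step: weighted updates of the mixture parameters
    p   <- mean(w)
    mu1 <- sum(w * x) / sum(w)
    mu2 <- sum((1 - w) * x) / sum(1 - w)
    s1  <- sqrt(sum(w * (x - mu1)^2) / sum(w))
    s2  <- sqrt(sum((1 - w) * (x - mu2)^2) / sum(1 - w))
  }

  round(c(p = p, mu1 = mu1, mu2 = mu2, s1 = s1, s2 = s2), 3)
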
Non-parametric density estimation
  • Kernel density estimation (3 sessions; see the R sketch after this list)
  • Further reading and case studies
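
Finally, a short kernel density estimation sketch in base R, comparing density() (Gaussian kernel with the rule-of-thumb bandwidth bw.nrd0) against a hand-rolled evaluation of the same estimator; the bimodal sample is simulated only to have something to smooth.

  set.seed(1)
  # Illustrative bimodal sample (assumed data)
  x <- c(rnorm(100, 0, 1), rnorm(60, 5, 1.5))

  # Built-in kernel density estimate: Gaussian kernel, bandwidth chosen by bw.nrd0
  kde <- density(x, kernel = "gaussian")
  plot(kde, main = "Kernel density estimate")
  rug(x)

  # The same estimator written out by hand at a grid of points, to show the formula
  # f_hat(t) = (1 / (n * h)) * sum over i of K((t - x_i) / h), with a Gaussian kernel K
  h <- bw.nrd0(x)
  grid <- seq(min(x) - 3 * h, max(x) + 3 * h, length.out = 200)
  fhat <- sapply(grid, function(t) mean(dnorm((t - x) / h)) / h)
  lines(grid, fhat, col = "red", lty = 2)   # overlays the density() curve
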

Reading list

Type  Author(s)  Title
B     Garthwaite, P.H., Jolliffe, I.T. & Jones, B.  Statistical Inference
B     Gelman, A., Carlin, J.B., Stern, H.S. & Rubin, D.B.  Bayesian Data Analysis
B     Kalbfleisch, J.G.  Probability and Statistical Inference
B     Lee, P.M.  Bayesian Statistics: An Introduction
B     Morgan, B.J.T.  Elements of Simulation
B     Tanner, M.A.  Tools for Statistical Inference

(A = essential, B = recommended, C = background.)

Most books on reading lists should also be available from the Blackwells shop at Jessop West.

Timetable (semester 1)

Thu 12:00 - 12:50 lecture   Diamond Building LT7
Fri 14:00 - 14:50 lecture   Hicks Lecture Theatre 7