MAS6011 Dependent Data
Note: This is an old module occurrence.
Both semesters, 2011/12; 20 Credits
Lecturer: Dr Nick Fieller
The unit develops concepts and techniques for the analysis of data having the complex structure typical of many real applications. The two main themes are the analysis of observations on several dependent variables, and the analysis of dependent observations made over a period of time on a single variable.

The unit begins with a practical introduction to multivariate analysis: data-mining techniques such as summarizing and displaying high-dimensional data and dimensionality reduction, principal components, multidimensional scaling, multivariate analysis of variance and discrimination. A review of repeated measures problems links to ideas of time series analysis. General techniques for the study of time series are developed, including structural descriptions, Box-Jenkins and state-space models and their fitting, and techniques for forecasting, covering local level, trend and seasonal time series. Emphasis is given to the practical implementation of the techniques using appropriate computer packages.
There are no prerequisites for this module.
No other modules have this module as a prerequisite.
- Semester 1: Multivariate Data Analysis
- Multivariate data summary
sample estimates of mean, covariance and variance
- Graphical displays
scatterplots, augmented plots, Andrews’ plots, special techniques.
- Exploratory analysis and dimensionality reduction
principal component analysis, principal component and crimcoord displays, implementation in R.
- Construction of statistical hypothesis tests
the likelihood ratio method and the union-intersection principle.
- Single and two-sample methods
Hotelling’s T² test, practical implementation in R.
- Multisample methods
multivariate analysis of variance and connection with crimcoords, interpretation of R analyses.
- Discriminant analysis
probabilities of misclassification, likelihood rules, linear discriminant analysis in R.
- Repeated measures
examples, dependence structures, modelling.
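Principal component analysis is the central dimension-reduction tool of Semester 1. The module implements it in R; as a language-neutral sketch with made-up bivariate data, the same computation can be done by hand using the closed-form eigendecomposition of a 2×2 sample covariance matrix:

```python
import math

# Toy bivariate data (hypothetical numbers, for illustration only).
data = [(2.5, 2.4), (0.5, 0.7), (2.2, 2.9), (1.9, 2.2),
        (3.1, 3.0), (2.3, 2.7), (2.0, 1.6), (1.0, 1.1),
        (1.5, 1.6), (1.1, 0.9)]

n = len(data)
mean_x = sum(x for x, _ in data) / n
mean_y = sum(y for _, y in data) / n

# Sample covariance matrix (divisor n - 1).
s_xx = sum((x - mean_x) ** 2 for x, _ in data) / (n - 1)
s_yy = sum((y - mean_y) ** 2 for _, y in data) / (n - 1)
s_xy = sum((x - mean_x) * (y - mean_y) for x, y in data) / (n - 1)

# Eigenvalues of the 2x2 covariance matrix in closed form.
tr, det = s_xx + s_yy, s_xx * s_yy - s_xy ** 2
disc = math.sqrt(tr ** 2 - 4 * det)
lam1, lam2 = (tr + disc) / 2, (tr - disc) / 2  # lam1 >= lam2

# First principal axis: eigenvector of the larger eigenvalue.
norm = math.hypot(s_xy, lam1 - s_xx)
pc1 = (s_xy / norm, (lam1 - s_xx) / norm)

# Proportion of total variance explained by the first component.
explained = lam1 / (lam1 + lam2)
print(f"PC1 direction: ({pc1[0]:.3f}, {pc1[1]:.3f}), "
      f"variance explained: {explained:.1%}")
```

In R the same analysis would be a one-line call to `prcomp` or `princomp`; the hand computation is only meant to show what those functions return.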
- Semester 2: Time Series
- Preliminary material
examples; purposes of analysis; components (trend, cycle, seasonal, irregular); stationarity, autocorrelation; approaches to time series analysis.
- Simple descriptive methods
smoothing; decomposition; differencing; autocorrelation. Probability models for stationary series: autoregressive models; moving average models; partial autocorrelation; invertibility; ARMA processes; ARIMA models for non-stationary series. Inference: identification and fitting; diagnostics; Ljung-Box statistic; choice of models; AIC.
- Introduction to forecasting
updating and errors; linear predictions; heuristic forecasting methods.
- State space models
formulation; filtering, prediction and smoothing; Kalman recursions; Bayesian inference; Bayesian forecasting; local level model; linear trend model; seasonal model.
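The autocorrelation function introduced in the Semester 2 material can be checked empirically. A minimal sketch (in Python rather than the module's R, with illustrative parameter values) simulates an AR(1) process and compares the sample ACF with its theoretical value φ^k:

```python
import random

random.seed(42)

# Simulate an AR(1) process x_t = phi * x_{t-1} + e_t (phi is illustrative).
phi, n = 0.7, 5000
x = [0.0]
for _ in range(n - 1):
    x.append(phi * x[-1] + random.gauss(0, 1))

def acf(series, lag):
    """Sample autocorrelation at the given lag."""
    m = sum(series) / len(series)
    c0 = sum((v - m) ** 2 for v in series)
    ck = sum((series[t] - m) * (series[t + lag] - m)
             for t in range(len(series) - lag))
    return ck / c0

# For a stationary AR(1) process the theoretical ACF is phi ** lag.
for lag in (1, 2, 3):
    print(f"lag {lag}: sample {acf(x, lag):+.3f}, theory {phi ** lag:+.3f}")
```

The geometric decay of the ACF, with no sharp cut-off, is exactly the signature used in Box-Jenkins model identification to distinguish AR from MA behaviour.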
Aims
- To illustrate extensions of univariate statistical methodology to dependent data.
- To introduce some of the distinctive statistical methodologies which arise only in the analysis of dependent data.
- To illustrate how models for dependent data may be constructed and studied.
- To introduce students to some of the computational techniques required for multivariate and time series analyses.
Outcomes
At the end of the unit, a student will:
- have some understanding of techniques of multivariate data summary and graphical display, and of the principles of multivariate exploratory data analysis and dimensionality reduction;
- have some understanding of the construction of multivariate likelihood ratio tests and of the union-intersection principle in multivariate testing;
- be able to perform and interpret principal component analysis and linear discriminant analysis using a computer package;
- be able to understand the results of computer-based multivariate analyses of one- and two-sample tests;
- be familiar with facilities offered by computer packages for multivariate analysis.
- understand general terms used in time series analysis, such as stationarity, autocorrelation function (ACF), and partial autocorrelation function (PACF).
- understand the structure of ARMA and ARIMA models and be able to derive the ACF and PACF for simple ARMA models.
- understand the approximate sampling behaviour of estimators of the ACF and PACF.
- be able to fit models to time series data, assess the fit, and if necessary modify the model, and be able to use the fitted models for forecasting.
- have undertaken an extended analysis of a problem involving a variety of multivariate methods and of a practical time series problem.
Teaching Methods
Lectures, with printed notes, plus task and exercise sheets and computer demonstrations. Some outside reading is also expected.
40 lectures, no tutorials
Assessment
Two projects (30%) and a three-hour restricted open-book examination (70%).
Full Syllabus
Semester 1: Multivariate Data Analysis
Multivariate data summary
- basic notation, sample estimates of mean, covariance and variance (1 session)
Graphical displays
- scatterplots, augmented plots, Andrews’ plots, special techniques (1 session)
Exploratory analysis and dimensionality reduction
- principal component analysis (2 sessions)
- principal component and crimcoord displays (2 sessions)
- implementation in R (1 session)
Construction of statistical hypothesis tests
- revision of univariate likelihood ratio tests (1 session)
- the likelihood ratio method in multivariate data (2 sessions)
- the union-intersection principle (2 sessions)
Single- and two-sample methods
- Hotelling’s T² test (1 session)
- practical implementation in R (1 session)
Multisample methods
- multivariate analysis of variance and connection with crimcoords (1 session)
- interpretation of R analyses (1 session)
Discriminant analysis
- probabilities of misclassification (2 sessions)
- likelihood rules (1 session)
- linear discriminant analysis in R (1 session)
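Hotelling's T² generalizes the one-sample t-test to a vector mean. A minimal sketch (hypothetical data; the module would use R) computes the one-sample statistic for bivariate observations, inverting the 2×2 sample covariance matrix explicitly:

```python
# Hypothetical bivariate sample; test H0: mu = (0, 0).
data = [(0.8, 1.1), (1.3, 0.4), (-0.2, 0.9), (0.7, 1.5),
        (1.1, 0.2), (0.4, 0.8), (0.9, 1.0), (0.1, 0.6)]
mu0 = (0.0, 0.0)

n, p = len(data), 2
xbar = (sum(x for x, _ in data) / n, sum(y for _, y in data) / n)

# Sample covariance matrix S (divisor n - 1).
s11 = sum((x - xbar[0]) ** 2 for x, _ in data) / (n - 1)
s22 = sum((y - xbar[1]) ** 2 for _, y in data) / (n - 1)
s12 = sum((x - xbar[0]) * (y - xbar[1]) for x, y in data) / (n - 1)

# Invert the 2x2 matrix S explicitly.
det = s11 * s22 - s12 ** 2
inv = (s22 / det, -s12 / det, s11 / det)  # (i11, i12, i22)

# Hotelling's T^2 = n (xbar - mu0)' S^{-1} (xbar - mu0).
d = (xbar[0] - mu0[0], xbar[1] - mu0[1])
t2 = n * (d[0] ** 2 * inv[0] + 2 * d[0] * d[1] * inv[1] + d[1] ** 2 * inv[2])

# Under H0, (n - p) / (p * (n - 1)) * T^2 has an F(p, n - p) distribution.
f_stat = (n - p) / (p * (n - 1)) * t2
print(f"T^2 = {t2:.2f}, F = {f_stat:.2f} on ({p}, {n - p}) df")
```

The conversion of T² to an F statistic is the standard exact null distribution result for normal data; the hand inversion is only feasible here because p = 2.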
Semester 2: Time Series
Preliminary material
- examples; purposes of analysis; components (trend, cycle, seasonal, irregular); stationarity, autocorrelation; approaches to time series analysis (2 sessions)
Simple descriptive methods
- smoothing; decomposition; differencing; autocorrelation (1 session)
Probability models for stationary series
- autoregressive models (1 session)
- moving average models; partial autocorrelation; invertibility (1 session)
- ARMA processes (2 sessions)
- ARIMA models for non-stationary series (1 session)
Inference
- identification and fitting (1 session)
- diagnostics; Ljung-Box statistic; choice of models; AIC (1 session)
Introduction to forecasting
- updating and errors (2 sessions)
- linear predictions; heuristic forecasting methods (2 sessions)
State space models
- formulation; filtering, prediction and smoothing; Kalman recursions (2 sessions)
- Bayesian inference; Bayesian forecasting (1 session)
- local level model; linear trend model; seasonal model (3 sessions)
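The local level model above is the simplest state space model: an observed series equal to a random-walk level plus noise, for which the Kalman recursions reduce to scalar updates. A minimal sketch (Python rather than the module's R, with illustrative variances) simulates such a series and filters it:

```python
import random

random.seed(1)

# Local level model: y_t = mu_t + v_t,  mu_t = mu_{t-1} + w_t.
q, r, n = 0.1, 1.0, 200   # state noise variance, observation noise variance
level, ys = 0.0, []
for _ in range(n):
    level += random.gauss(0, q ** 0.5)      # random-walk level
    ys.append(level + random.gauss(0, r ** 0.5))  # noisy observation

# Scalar Kalman filter for the local level model.
m, p_var = 0.0, 10.0      # diffuse-ish prior mean and variance for the level
for y in ys:
    p_pred = p_var + q            # predict: level is a random walk
    k = p_pred / (p_pred + r)     # Kalman gain
    m = m + k * (y - m)           # update mean with the innovation y - m
    p_var = (1 - k) * p_pred      # update variance

print(f"final filtered level: {m:.3f} (true level {level:.3f}), "
      f"steady-state gain {k:.3f}")
```

The gain converges to a steady-state value determined by the signal-to-noise ratio q/r, which is why filtering this model is equivalent to simple exponential smoothing with a particular smoothing constant.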