MAS190 Introduction to Probability and Statistics 1 (NJTech)
Note: Information for future academic years is provisional. Timetable information and teaching staff are especially likely to change, but other details may also be altered, some courses may not run at all, and other courses may be added.
Note: This module is not freely available for students to register. Most likely it is only available to external repeat students without lectures running. Please contact the School of Maths Assessment Coordinator, Dr Jayanta Manoharmayum, if you need more details.
Semester 2, 2018/19 | 10 Credits
Lecturer: Dr Jonathan Potts | Reading List
This module is the first of two providing an introduction to the fields of probability and statistics, which form the basis of much of applicable mathematics and operations research. The theory behind probability and statistics will be introduced, along with examples occurring in such diverse areas as medicine, finance, sport, the environment, law and so on. Some of the computational statistical work will make use of the statistics package R.
There are no prerequisites for this module.
No other modules have this module as a prerequisite.
- Introduce students to the theory of probability, including applications to practical examples;
- interpret and perform calculations involving random variables and distributions;
- recognise important standard distributions;
- apply the idea of conditional probability via the law of total probability and Bayes' rule;
- use the software package R for simple calculations, plots, and working with standard distributions.
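As a sketch of the kind of calculation covered by the conditional-probability aim above, here is a worked example of the law of total probability and Bayes' rule. The module itself uses R; the sketch below is in plain Python, and the diagnostic-test scenario and all of its numbers are purely illustrative assumptions, not taken from the course.

```python
# Hypothetical diagnostic-test example (all numbers illustrative).
# D = disease present, T = test positive.
p_d = 0.01              # prior P(D)
p_t_given_d = 0.95      # sensitivity, P(T | D)
p_t_given_not_d = 0.05  # false-positive rate, P(T | not D)

# Law of total probability:
# P(T) = P(T | D) P(D) + P(T | not D) P(not D)
p_t = p_t_given_d * p_d + p_t_given_not_d * (1 - p_d)

# Bayes' rule: posterior P(D | T) = P(T | D) P(D) / P(T)
p_d_given_t = p_t_given_d * p_d / p_t
print(round(p_d_given_t, 3))  # about 0.161
```

Even with a fairly accurate test, the posterior probability of disease given a positive result is only about 16%, because the prior is small; this is the standard illustration of why the prior matters in Bayes' rule.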
32 lectures, 32 tutorials
One formal 2-hour written examination; all questions are compulsory.
- Statistical and probabilistic modelling, and the need for a mathematical theory of chance.
- Sets, unions, intersection, complement. Venn diagrams. Sample spaces and events.
- The idea of measure of a set. Counting measure. Properties of measures. Probability as measure.
- Calculating probabilities in practice: use of symmetry, relative frequencies, subjective probability.
- Joint and conditional probability, Bayes' theorem, prior and posterior probabilities. Independence.
- Discrete random variables. Cumulative distributions and probability laws/mass functions.
- Expectation and variance and their properties (e.g. E(X+Y) = E(X) + E(Y), E(aX+b) = aE(X) + b, Var(aX+b) = a²Var(X)).
- Bernoulli, binomial, Poisson and geometric random variables. Calculations of laws, means and variances. The Poisson distribution as the limit of a binomial. The binomial and Poisson distribution in R.
- Multivariate discrete random variables. Covariance and correlation between two discrete random variables. The multinomial distribution.
- Area under a curve as a measure. Probability via integration. Continuous random variables and their pdfs.
- Examples. Uniform and exponential distributions.
- Mean and variance as integrals.
- The normal distribution. The normal distribution in R. The standard normal Z. Mean and variance in general case via X = σZ + μ.
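One syllabus point above, the Poisson distribution as the limit of a binomial, can be checked numerically in a few lines. In the module this would be done in R (whose `dbinom` and `dpois` functions give these mass functions directly); the sketch below uses only the Python standard library, fixing λ = np and letting n grow:

```python
from math import comb, exp, factorial

# Binomial pmf: P(X = k) for X ~ Bin(n, p)
def binom_pmf(k, n, p):
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Poisson pmf: P(X = k) for X ~ Poisson(lam)
def poisson_pmf(k, lam):
    return exp(-lam) * lam**k / factorial(k)

# Keep lam = n * p fixed and let n grow: Bin(n, lam/n) -> Poisson(lam).
lam = 2.0
for n in (10, 100, 1000):
    print(n, round(binom_pmf(3, n, lam / n), 5), round(poisson_pmf(3, lam), 5))
```

As n increases with np held at 2, the binomial probability of exactly 3 events approaches the Poisson value e⁻² · 2³/3! ≈ 0.18045, which is the limiting behaviour the syllabus refers to.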
B | Applebaum | Probability and Information: An Integrated Approach (2nd ed)
B | Dekking, Kraaikamp, Lopuhaa and Meester | A Modern Introduction to Probability and Statistics: Understanding Why and How
B | Grimmett and Welsh | Probability: An Introduction
B | Ross | A First Course in Probability (8th ed)
C | Blastland and Dilnot | The Tiger That Isn't: Seeing Through a World of Numbers
C | Schoenberg | Introduction to Probability with Texas Hold'em Examples
C | Silver | The Signal and the Noise: The Art and Science of Prediction
(A = essential, B = recommended, C = background.)
Most books on reading lists should also be available from the Blackwells shop at Jessop West.