
Unit information: Statistical Methods 2 in 2019/20

Please note: Due to alternative teaching and assessment arrangements in place from 18 March 2020 in response to COVID-19 restrictions, information shown for 2019/20 may not always be accurate.

Please note: you are viewing unit and programme information for a past academic year. Please see the current academic year for up-to-date information.

Unit name Statistical Methods 2
Unit code MATHM0038
Credit points 20
Level of study M/7
Teaching block(s) Teaching Block 2 (weeks 13 - 24)
Unit director Dr. Gerber
Open unit status Not open
Pre-requisites

Statistical Methods 1 and Statistical Computing 1

Co-requisites

None

School/department School of Mathematics
Faculty Faculty of Science

Description including Unit Aims

This unit complements Statistical Methods 1 (its prerequisite) with new topics, such as generative statistical models (discriminative models were the focus of Statistical Methods 1), and with more detailed coverage of core statistical techniques: penalization methods and sparsity, approximate and fully Bayesian inference, and additive modelling.

Intended Learning Outcomes

By the end of the unit students should be able to:

  • Distinguish between discriminative and generative statistical models, and apply generative modelling approaches to tasks including classification, clustering, dimensionality reduction and data compression, and missing data imputation.
  • Describe penalized likelihood approaches to model fitting, and implement them for inference and prediction using state-of-the-art packages in R (an illustrative sketch follows this list).
  • Perform numerical optimization using standard algorithms, and write bespoke optimizers for functions with particular properties.
  • Explain the motivation for and challenges of a ‘fully Bayesian’ approach to statistical inference and prediction, and how Markov chain Monte Carlo techniques can be used to implement it.
  • Formulate a Bayesian hierarchical model, implement it in specialized software, and perform convergence assessment and code validation.
  • Describe and implement additive modelling approaches, including strategies for specifying control parameters, and approximation methods for very large datasets.
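To make the penalized likelihood and additive modelling outcomes above more concrete, the short R sketch below shows a cross-validated lasso fit with the glmnet package and a simple additive model with the mgcv package. It is an illustrative sketch only, not part of the unit materials: the simulated data, variable names and choice of packages are assumptions.

  # Illustrative sketch: assumed packages (glmnet, mgcv) and simulated data,
  # not official unit code
  library(glmnet)   # penalized likelihood: lasso / ridge / elastic net
  library(mgcv)     # additive models

  set.seed(1)
  n <- 200; p <- 10
  X <- matrix(rnorm(n * p), n, p)         # simulated design matrix
  y <- X[, 1] - 0.5 * X[, 2] + rnorm(n)   # sparse linear signal plus noise

  # Penalized likelihood: lasso fit, penalty weight chosen by cross-validation
  cv_fit <- cv.glmnet(X, y, alpha = 1)
  print(coef(cv_fit, s = "lambda.min"))   # sparse coefficient estimates

  # Additive model: smooth function of the first covariate
  dat <- data.frame(y = y, x1 = X[, 1])
  gam_fit <- gam(y ~ s(x1), data = dat)
  summary(gam_fit)                        # estimated smooth term and fit summary

In glmnet, alpha = 1 selects the lasso penalty and alpha = 0 gives ridge regression; intermediate values give the elastic net.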

Teaching Information

Some lab-based instruction, as mentioned in the unit details above.

Assessment Information

Formative: one homework assignment each week

Summative:

  1. A personal portfolio of notes, code snippets, and vignettes (30%).
  2. Two pieces of assessed coursework (20% each).
  3. A group project (30%).

Reading and References

T. Hastie, R. Tibshirani, and J. Friedman (2017), The Elements of Statistical Learning, 2nd edition, Springer
