Module overview
Machine Learning is about extracting useful information from large and complex datasets. The subject is a rich mixture of concepts from functional analysis, statistical modelling and computational techniques. The module covers the fundamental principles of the subject: you will learn the theoretical basis of how learning algorithms are derived and when they are optimally applied, and gain hands-on experience in laboratory-based sessions. It leads on to more advanced material covered in later modules.
Exclusions: Cannot be taken with COMP3206 or COMP3222 or COMP3223 or COMP6229 or COMP6246.
Aims and Objectives
Learning Outcomes
Subject Specific Practical Skills
Having successfully completed this module you will be able to:
- Gain expertise in using machine learning to make predictions in a scientific computing environment
- Systematically work with data to learn new patterns or concepts
Subject Specific Intellectual and Research Skills
Having successfully completed this module you will be able to:
- Gain a critical appreciation of the latest research issues
- Characterise data in terms of explanatory models
- Derive learning algorithms from first principles and quantify their performance
Knowledge and Understanding
Having successfully completed this module, you will be able to demonstrate knowledge and understanding of:
- The mathematical foundations of machine learning as inspired by biological learning
- Underlying mathematical principles from probability, linear algebra and optimisation
Syllabus
Historical Perspective
- Biological motivations: the McCulloch and Pitts neuron, Hebbian learning.
- Statistical motivations
Theory
- Generalisation: What is learning from data?
- The power of machine learning methods: What is a learning algorithm? What can they do?
Probability
- Probability as representation of uncertainty in models and data
- Bayes Theorem and its applications
- Law of large numbers and the Multivariate Gaussian distribution
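As a flavour of the "Bayes Theorem and its applications" bullet above, here is a minimal sketch of a posterior calculation. The diagnostic-test numbers (prior, sensitivity, false-positive rate) are illustrative assumptions, not module material:

```python
# Hypothetical diagnostic-test numbers, chosen purely for illustration.
prior = 0.01           # P(disease)
sensitivity = 0.90     # P(positive | disease)
false_positive = 0.05  # P(positive | no disease)

# Bayes' theorem: P(disease | positive) = P(positive | disease) P(disease) / P(positive)
evidence = sensitivity * prior + false_positive * (1 - prior)
posterior = sensitivity * prior / evidence

print(round(posterior, 3))  # the posterior stays low despite a "90% accurate" test
```

The point of the example is the classic base-rate effect: with a rare condition, even a sensitive test yields a posterior well below one half.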
Optimisation
- Convexity
- 1-D minimisation
- Gradient methods in higher dimensions
- Constrained optimisation
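The 1-D minimisation and gradient-method bullets above can be sketched in a few lines. The objective, starting point and step size below are toy choices for illustration, not taken from the module notes:

```python
# Gradient descent on the convex quadratic f(x) = (x - 3)^2,
# a toy 1-D minimisation problem with minimiser x* = 3.
def f_prime(x):
    return 2.0 * (x - 3.0)  # derivative of f

x = 0.0      # arbitrary starting point
step = 0.1   # fixed step size
for _ in range(100):
    x -= step * f_prime(x)  # move against the gradient
# x is now very close to the minimiser 3.0
```

Because f is convex, any small enough fixed step converges; the module covers when and how fast such methods converge in higher dimensions.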
Linear Algebra
- Using matrices to find solutions of linear equations
- Properties of matrices and vector spaces
- Eigenvalues, eigenvectors and singular value decomposition
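The three linear-algebra bullets above correspond directly to standard operations in a scientific computing environment. A minimal sketch (NumPy is an illustrative choice; the matrix is a toy example):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [1.0, 3.0]])   # a small symmetric matrix
b = np.array([1.0, 2.0])

x = np.linalg.solve(A, b)         # solve the linear system A x = b
evals, evecs = np.linalg.eigh(A)  # eigenvalues/eigenvectors (symmetric case)
U, s, Vt = np.linalg.svd(A)       # singular value decomposition
```

For a symmetric positive-definite matrix like this one, the singular values coincide with the eigenvalues, which is a useful sanity check when first meeting the SVD.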
Supervised Learning
- Regression Analysis
- Classification using Bayesian principles
- Perceptron Learning
- Support Vector Machines and introduction to Kernel methods
- Neural networks/multi-layer perceptrons (MLP)
- Features and discriminant analysis
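As a flavour of the perceptron learning rule listed above, a minimal sketch that learns the logical AND function (the dataset, learning rate and epoch count are illustrative assumptions):

```python
# Perceptron learning rule on the (linearly separable) AND function.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b, lr = [0.0, 0.0], 0.0, 0.1  # weights, bias, learning rate

for _ in range(20):  # a few passes suffice for this separable problem
    for (x1, x2), target in data:
        pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
        err = target - pred            # 0 when correct; ±1 when wrong
        w[0] += lr * err * x1          # update only on mistakes
        w[1] += lr * err * x2
        b += lr * err
```

The perceptron convergence theorem, covered in the module, guarantees this loop terminates with a perfect separator whenever the data are linearly separable.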
Data handling and unsupervised learning
- Principal Components Analysis (PCA)
- K-Means clustering
- Spectral clustering
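The K-Means bullet above alternates two steps, assignment and update, which can be sketched in one dimension (the data points and initial centres below are toy values chosen for illustration):

```python
# Minimal 1-D k-means sketch: two well-separated groups of points.
points = [1.0, 1.2, 0.8, 8.0, 8.2, 7.8]
centres = [0.0, 5.0]  # deliberately poor initial guesses

for _ in range(10):
    # Assignment step: attach each point to its nearest centre.
    clusters = [[], []]
    for p in points:
        nearest = min(range(2), key=lambda k: (p - centres[k]) ** 2)
        clusters[nearest].append(p)
    # Update step: move each centre to the mean of its cluster.
    centres = [sum(c) / len(c) if c else centres[k]
               for k, c in enumerate(clusters)]
```

On this toy data the centres settle near 1.0 and 8.0 after the first iteration; the module treats initialisation and convergence properties more carefully.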
Regression and Model-fitting Techniques
- Linear regression
- Polynomial Fitting
- Kernel Based Networks
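The polynomial-fitting bullet above reduces to a least-squares problem. A minimal sketch using NumPy (the true coefficients, noise level and random seed are illustrative assumptions):

```python
import numpy as np

# Synthetic data from a known quadratic, plus a little Gaussian noise.
rng = np.random.default_rng(0)
x = np.linspace(-1, 1, 50)
y = 1.0 + 2.0 * x - 3.0 * x**2 + rng.normal(0.0, 0.05, x.shape)

# Least-squares fit of a degree-2 polynomial (coefficients, highest power first).
coeffs = np.polyfit(x, y, deg=2)
# coeffs recovers approximately [-3, 2, 1]
```

Linear regression is the special case deg=1; the module also discusses how increasing the degree trades bias against variance.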
Case Studies
- Example applications: Speech, Vision, Natural Language, Bioinformatics.
Learning and Teaching
Teaching and learning methods
Lectures, labs and guided self-study
| Type | Hours |
| --- | --- |
| Follow-up work | 10 |
| Lecture | 20 |
| Revision | 10 |
| Completion of assessment task | 18 |
| Wider reading or practice | 76 |
| Supervised time in studio/workshop | 6 |
| Preparation for scheduled sessions | 10 |
| Total study time | 150 |
Resources & Reading list
Textbooks
MacKay, David J. C. Information Theory, Inference and Learning Algorithms.
Bishop, Christopher M. Pattern Recognition and Machine Learning.
Assessment
Summative
This is how we’ll formally assess what you have learned in this module.
| Method | Percentage contribution |
| --- | --- |
| Examination | 80% |
| Coursework | 20% |
Referral
This is how we’ll assess you if you don’t meet the criteria to pass this module.
| Method | Percentage contribution |
| --- | --- |
| Examination | 100% |
Repeat
An internal repeat is where you take all of your modules again, including any you passed. An external repeat is where you only re-take the modules you failed.
| Method | Percentage contribution |
| --- | --- |
| Examination | 100% |
Repeat Information
Repeat type: Internal & External