Week 3: Linear Algebra and Regression
For this week the lecture slides are available here.
YouTube Video
There is a YouTube video available of me giving this material at the Gaussian Process Road Show in Uganda.
You will need to watch the video in HD to see the maths clearly.
Lab Class
Linear regression with numpy and Python.
The notebook for the lab class can be downloaded from here.
To obtain the lab class, first open the IPython notebook. Then paste the following code into a notebook cell and run it:

import urllib
urllib.urlretrieve('https://raw.githubusercontent.com/SheffieldML/notebook/master/lab_classes/machine_learning/week3.ipynb', 'week3.ipynb')

(In Python 3 this function has moved: use urllib.request.urlretrieve instead.)

You should now be able to find the lab class by clicking File->Open
in the IPython notebook menu.
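The lab covers linear regression with numpy. As a minimal sketch of the kind of fit involved (the data and parameter values below are illustrative, not taken from the lab notebook), we can build a design matrix with a bias column and solve the least-squares problem directly:

```python
import numpy as np

# Illustrative data: a linear relationship y = 2x + 0.5 plus a little noise.
rng = np.random.default_rng(0)
x = np.linspace(0, 1, 20)
y = 2.0 * x + 0.5 + rng.normal(scale=0.05, size=x.shape)

# Design matrix with a column of ones so the model includes a bias term.
X = np.column_stack([np.ones_like(x), x])

# Least-squares fit: finds w minimising ||y - Xw||^2.
w, *_ = np.linalg.lstsq(X, y, rcond=None)
print(w)  # first entry is the recovered bias, second the slope
```

Here `np.linalg.lstsq` does the minimisation for us; the lab itself works through how that solution arises.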
Reading
- Reading (Regression)
- Sections 1.1-1.3 of Rogers and Girolami.
- Section 1.2.5 of Bishop up to Eq 1.65.
- Section 1.1 of Bishop.
- Reading (Linear Algebra, Matrix and Vector Review)
- Section 1.3 of Rogers and Girolami.
- Linear Algebra Guide
- Reading (Basis Functions)
- Chapter 1, pg 1-6 of Bishop.
- Section 1.4 of Rogers and Girolami.
- Chapter 3, Section 3.1 of Bishop up to pg 143.
Learning Outcomes Week 3
- Understand the relation between the sum of squares error and a Gaussian likelihood.
- Understand that a stationary point of an objective function is found by setting its gradients to zero.
- Understand the algorithmic approach of coordinate ascent.
- Understand linear algebraic representations of the objective function and the prediction function.
- Perform vector differentiation of the sum of squares objective function.
- Recover the stationary point of a multivariate linear regression sum of squares objective.
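Several of these outcomes can be sketched together in numpy. Setting the gradient of the sum of squares objective to zero gives the normal equations, and optimising one coordinate of w at a time (coordinate descent on the error, i.e. coordinate ascent on the Gaussian log likelihood) reaches the same stationary point. The data below is synthetic and purely illustrative:

```python
import numpy as np

# Synthetic regression problem (illustrative values, not from the course).
rng = np.random.default_rng(1)
n, d = 50, 3
X = rng.normal(size=(n, d))
true_w = np.array([1.0, -2.0, 0.5])
y = X @ true_w + rng.normal(scale=0.01, size=n)

# Stationary point of E(w) = (y - Xw)^T (y - Xw):
# dE/dw = -2 X^T y + 2 X^T X w = 0  =>  (X^T X) w = X^T y.
w_direct = np.linalg.solve(X.T @ X, X.T @ y)

# Coordinate optimisation: minimise E over one w_j at a time,
# holding the other coordinates fixed, and sweep repeatedly.
w = np.zeros(d)
for sweep in range(100):
    for j in range(d):
        # Residual with coordinate j's contribution removed.
        r = y - X @ w + X[:, j] * w[j]
        w[j] = (X[:, j] @ r) / (X[:, j] @ X[:, j])

print(np.allclose(w, w_direct, atol=1e-6))  # both reach the same minimiser
```

The closed-form solve and the coordinate sweeps agree, which is the point of the final two learning outcomes: the stationary point recovered by vector differentiation is the same one the iterative scheme converges to.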
This document last modified Monday, 13-Oct-2014 15:51:11 UTC