Introduction to Machine Learning

This page is slowly being retired: as material is moved to the new webpage it is deleted from here. Eventually this page will disappear.

Timetable, lecture slides and recommended textbooks.

Project Info

Link to the 2020 project description.

Timetable 2020

Lecture Date Topic and Slides Links to additional material
1 23/1 Introduction. Slides. Read chapter 1 of any (if not all) of the recommended books linked above. See also the papers Scaling to Very Very Large Corpora for Natural Language Disambiguation and The Unreasonable Effectiveness of Data. Python notebook used in the lectures to generate the figures, or view it online.
2 28/1 Linear Regression. Slides. Chapter 1 of A First Course in Machine Learning, available online at the library; sections 3.5, 9.1 and 9.2 of Machine Learning: An Algorithmic Perspective, available online at the university library. Search on YouTube for Andrew Ng's machine learning lectures; there are quite a number of useful lectures on regression.
3 30/1 Probability, Bayes' theorem, and Naive Bayes classification. Slides. Section 2.3 of Machine Learning: An Algorithmic Perspective (UB); chapter 2 and section 5.2 of A First Course in Machine Learning (UB). Two useful videos (1 and 2) on Gaussian priors.
Lab 3/2 Lab instructions  
4 6/2 Logistic regression. Slides. Section 5.2.2 of A First Course in Machine Learning (UB); chapter 5 of Machine Learning Algorithms (Bonaccorso, Giuseppe), available online (UB).
  10/2 Lecture Cancelled  
5 13/2 Support Vector Machines. Slides. There are lots of resources on support vector machines. See for example: section 5.3.2 of A First Course in Machine Learning (UB); chapter 8 of Machine Learning: An Algorithmic Perspective (UB); chapter 7 of Machine Learning Algorithms (Bonaccorso, Giuseppe), available online (UB). Other useful tutorials: data-flair, Towards Data Science and the MonkeyLearn Blog. For an in-depth tutorial that covers the mathematics behind it, see this article.
7 20/2 Cross validation and a little bit of feature engineering. Slides. Online snapshot of the notebook used to generate the examples. The textbook does not have much material on cross validation, but you can look at chapter 7 (Model Assessment and Selection) of The Elements of Statistical Learning and the scikit-learn documentation; see also the small cross-validation sketch after the timetable.
Lab 27/2 Lab Instructions  
3/3 Lecture Cancelled  
8 4/3 Decision Trees. Notes. See the excellent notes by Richard Johansson. Also look at chapter 8 of An Introduction to Statistical Learning with Applications in R, available online via the university library. The paper Selected Algorithms of Machine Learning from Examples by Jerzy W. Grzymala-Busse also contains several worked examples. I am only going to cover trees for classification, not trees for regression. This YouTube video goes through an example of calculating a decision tree in detail; a small entropy and information-gain sketch is also given after the timetable. Information theory started with one paper, A Mathematical Theory of Communication by Claude E. Shannon, and it is quite instructive to read the original paper.
9 6/3 Preprocessing, Principal component analysis. Lecture notes. Chapter 6 of Machine Learning: An Algorithmic Perspective and sections 7.1 and 7.2 of A First Course in Machine Learning (UB). To understand the derivation of PCA you need to understand how to differentiate matrix expressions; these slides are quite useful. For an interesting application of PCA look at eigenfaces; a small PCA sketch is given after the timetable.
10 12/3 Advice about the exam  
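As a small illustration of the cross-validation topic from lecture 7, the following is a minimal sketch assuming scikit-learn is installed; the iris data set and the logistic-regression model are placeholder choices, not the examples used in the lecture.

    # Minimal k-fold cross-validation sketch with scikit-learn.
    from sklearn.datasets import load_iris
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    X, y = load_iris(return_X_y=True)
    model = LogisticRegression(max_iter=1000)

    # 5-fold cross validation: train on 4 folds, score on the held-out fold,
    # once for each fold.
    scores = cross_val_score(model, X, y, cv=5)
    print("accuracy per fold:", scores)
    print("mean accuracy:", scores.mean())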
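As an illustration of the entropy and information-gain calculations used when building a classification tree (lecture 8), the following is a minimal plain-Python sketch; the toy labels and the candidate split are made up for illustration.

    # Entropy and information gain for a candidate split, in bits.
    from collections import Counter
    from math import log2

    def entropy(labels):
        # Shannon entropy of a list of class labels.
        n = len(labels)
        return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

    def information_gain(labels, split_groups):
        # Parent entropy minus the weighted entropy of the child groups.
        n = len(labels)
        remainder = sum(len(g) / n * entropy(g) for g in split_groups)
        return entropy(labels) - remainder

    # Toy example: 10 examples split into two groups by some attribute.
    labels = ["yes"] * 6 + ["no"] * 4
    split = [["yes"] * 5 + ["no"], ["yes"] + ["no"] * 3]
    print(entropy(labels))                 # roughly 0.971 bits
    print(information_gain(labels, split))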
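As an illustration of PCA from lecture 9, the following is a minimal NumPy sketch that finds principal components as eigenvectors of the covariance matrix; the random data is only a placeholder.

    # PCA via the eigendecomposition of the covariance matrix.
    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 3))            # 100 samples, 3 features

    X_centred = X - X.mean(axis=0)           # 1. centre the data
    cov = np.cov(X_centred, rowvar=False)    # 2. covariance matrix (3 x 3)
    eigvals, eigvecs = np.linalg.eigh(cov)   # 3. eigendecomposition

    # 4. keep the two eigenvectors with the largest eigenvalues and project.
    components = eigvecs[:, np.argsort(eigvals)[::-1][:2]]
    X_reduced = X_centred @ components       # shape (100, 2)
    print(X_reduced.shape)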
Justin Pearson