Each lecture comes with a list of recommended problems, shown in the table below. Numbers without a letter prefix refer to problems in the book by Bishop. Numbers prefixed with HTF refer to exercises in the book by Hastie, Tibshirani and Friedman.
Note that the slides provided below cover only a small part of the lectures; the whiteboard is used quite extensively.
The schedule is available here.
- [pdf] Introduction (Chap. 1-2 and notes). Problems: 2.13, 2.29, 2.32, 2.34, 2.40, 2.44, 2.47.
- [pdf] Linear regression (Chap. 3, HTF Chap. 3). Problems: 1.25, 1.26, 3.8, 3.9, 3.12, 3.13.
- [pdf] Linear classification (Chap. 4). Problems: 4.5, 4.19, 4.25, HTF: 2.8.
- [pdf] Neural networks, kernel methods intro. (Chap. 5-6.3). Problems: 5.4, 5.16, 6.3, HTF: 11.5.
- [pdf] Kernel methods (Chap. 6.4-7). Problems: Available here, m-file.
- [pdf] EM and clustering (Chap. 9 and Notes). Problems: 9.8, 9.9, 9.11, 12.24 (also in Matlab, see lecture 1).
- [pdf] Approximate inference (Chap. 10, Notes, and Code). Problems: 10.4, 10.7, 10.26, 10.38.
- [pdf] Graphical models (Chap. 8). Problems: 14.6, 14.7, 8.1, 8.3, 8.4, 8.7.
- [pdf] Graphical models and message passing (Chap. 8, code). Problems: 8.10, 8.11, 8.19, 8.23, 8.27.
- [pdf] MCMC and sampling methods (Chap. 11 and Code). Problems: Available here, m-file.
- [pdf] Bayesian nonparametric models (P1, P2, P3 and Code).