Selected Problems in Machine Learning
- Teachers: Zdeněk Žabokrtský, David Mareček
- Time and location: Tuesday 12:20–14:00, S7
Course focus
The course has been designed especially for PhD students with a deep interest in Machine Learning.
It is a flexible combination of lectures, discussions, exercises, and literature reading, aimed at
the following three topics:
- refreshing (and deepening the understanding of) basic notions of Machine Learning
- introduction to Bayesian inference
- practising unsupervised ML (especially methods based on sampling)
Course prerequisites
Students are expected to be familiar with basic probabilistic and
ML concepts, roughly to the extent of NPFL067/068 - Statistical Methods in NLP I/II and
NPFL054 - Introduction to Machine Learning (in NLP).
Course schedule
- "Calibration" test - let me know what you already know
- Patching the holes revealed by the calibration test.
- Patching the holes revealed by the calibration test, continued.
- An in-depth exercise on the Beta distribution.
- let us admire two mighty parameters that generate a broad family of different shapes
- generalization to n dimensions - the Dirichlet distribution
- supplementary materials - mathematicalmonk's videos:
- Derivation of some simple Bayesian models - let's enjoy conjugacy!
- Beta-Bernoulli model
- Dirichlet-Categorical model
- supplementary materials - mathematicalmonk's videos:
- Assignment 1 - Word-alignment using Gibbs sampling
- Reading - Bayesian Inference
- Kernel methods
- Assignment 2 - Segmentation of dependency trees
- Gibbs sampling in NLP - two case studies
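The "two mighty parameters" of the Beta distribution mentioned in the schedule can be explored with a short sketch. This is an illustrative pure-Python density (the function name is ours, not part of any assignment):

```python
from math import gamma

def beta_pdf(x, a, b):
    """Density of Beta(a, b) at x in (0, 1), via the Beta function."""
    beta_fn = gamma(a) * gamma(b) / gamma(a + b)
    return x ** (a - 1) * (1 - x) ** (b - 1) / beta_fn

# Beta(1, 1) is uniform, Beta(0.5, 0.5) is U-shaped,
# Beta(2, 2) is bell-shaped -- two parameters, many shapes.
for a, b in [(1, 1), (0.5, 0.5), (2, 2), (2, 5)]:
    print(a, b, [round(beta_pdf(x, a, b), 3) for x in (0.1, 0.5, 0.9)])
```

Evaluating the density at a few points already shows the uniform, U-shaped, and bell-shaped regimes.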
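The conjugacy enjoyed in the Beta-Bernoulli model means the posterior stays in the Beta family, so the update is just counting. A minimal sketch (the function name is illustrative):

```python
def beta_bernoulli_update(a, b, observations):
    """Beta(a, b) prior + Bernoulli observations -> Beta posterior.

    Conjugacy: the posterior is again Beta, with the successes
    added to a and the failures added to b.
    """
    successes = sum(observations)
    failures = len(observations) - successes
    return a + successes, b + failures

# Uniform prior Beta(1, 1), then observe three heads and one tail:
a_post, b_post = beta_bernoulli_update(1, 1, [1, 1, 0, 1])
print(a_post, b_post, a_post / (a_post + b_post))  # posterior mean
```

The Dirichlet-Categorical model generalizes this in the same way: add the count of each category to the corresponding Dirichlet parameter.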
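Gibbs sampling, used in both assignments, can be illustrated on a toy target where both full conditionals are known in closed form, here a standard bivariate normal with correlation rho (a sketch only, not the assignments' alignment or segmentation models):

```python
import random

def gibbs_bivariate_normal(rho, n_iter, seed=0):
    """Gibbs sampler for a standard bivariate normal with correlation rho.

    Each step resamples one coordinate from its full conditional:
    x | y ~ N(rho * y, 1 - rho^2), and symmetrically for y | x.
    """
    rng = random.Random(seed)
    sd = (1 - rho ** 2) ** 0.5
    x = y = 0.0
    samples = []
    for _ in range(n_iter):
        x = rng.gauss(rho * y, sd)  # resample x given current y
        y = rng.gauss(rho * x, sd)  # resample y given current x
        samples.append((x, y))
    return samples

draws = gibbs_bivariate_normal(0.9, 5000)
```

After enough sweeps, the empirical means approach 0 and the empirical correlation approaches rho; the assignments apply the same resample-one-variable-at-a-time idea to discrete alignment and segmentation variables.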
Other useful links
Course passing requirements
All students are required to actively participate in the classes.