News

2013-06-25 The students' seminar work is available at https://github.com/bayesian-inference.
2013-04-22 Video recordings of the lectures are available at http://www.youtube.com/playlist?list=PLrM7Z8xNORRdvGS6qEkbNmXavtutAEEeG. All recorded slides are available at http://www.youtube.com/playlist?list=PLrM7Z8xNORRdiuN6TjWDAfh4g633XdWGl.
2013-04-22 Dear students, based on the last doodle, we will meet on Friday the 26th at 9:00am. I have decided that the most important thing is to go through the examples presented during the guest lectures. Therefore, each of you should present at least one solution to an example on a whiteboard in front of your colleagues. The presentation should include a description of the problem, a detailed derivation, and a computer implementation of the solution, ideally in Python, without using high-level libraries such as PyGP or PyMC. NumPy or SciPy is fine. Details will be discussed this Friday. Below is the list of examples; each of you can sign up for one or more examples (alternatives) you are interested in solving.
 
1. L1: TrueSkill model as described by Jose + inference (how to implement this must still be worked out)
2. L2: Message passing on a clique tree
3. L3: Laplace approximation: The probit regression model
4. L?: Variational inference: The probit regression model (not shown)
5. L3: Variational inference: 2D Ising Model
6. L3: Variational inference: Unknown mean and variance of a Gaussian with improper priors.
7. L3: Variational inference - Local methods: Logistic regression
8. L4: Expectation Propagation: The Clutter Problem
9. L?: Variational inference: The Clutter Problem (not shown)
10. L4: Expectation Propagation: The probit regression model
11. L5: Bayesian inference for the regression problem as on slides 4-11 + a modified version with unknown $\sigma^2$
12. L5: Gaussian Processes: Sampling from a GP, inference in a GP with prior m(x) = 0 and k(x,x') = {squared exponential, rational quadratic}, analysis of the impact of different covariance functions on the resulting approximations (see the first sketch below the list).
13. L6: Gibbs Sampling: Probit regression (see the second sketch below the list)
14. L7: MH Random walk: Logit regression
15. L7: Gibbs Sampling: Probit regression in multiclass setting
16. L7: Gibbs Sampling: Probit regression + spike and slab prior
17. L8: Dirichlet Process: Mixture model on a real line using marginal MCMC inference.
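
For example 12, the following is a minimal sketch of sampling from a zero-mean GP prior under the two covariance functions named above. The kernel hyperparameters, the input grid, and the number of draws are illustrative assumptions, and posterior inference is left out.

import numpy as np

def squared_exponential(x1, x2, ell=1.0, sf2=1.0):
    # k(x, x') = sf2 * exp(-(x - x')^2 / (2 ell^2))
    d = x1[:, None] - x2[None, :]
    return sf2 * np.exp(-0.5 * (d / ell) ** 2)

def rational_quadratic(x1, x2, ell=1.0, sf2=1.0, alpha=1.0):
    # k(x, x') = sf2 * (1 + (x - x')^2 / (2 alpha ell^2))^(-alpha)
    d = x1[:, None] - x2[None, :]
    return sf2 * (1.0 + d ** 2 / (2.0 * alpha * ell ** 2)) ** (-alpha)

rng = np.random.default_rng(0)
x = np.linspace(-5.0, 5.0, 200)  # illustrative input grid

for name, kernel in [("squared exponential", squared_exponential),
                     ("rational quadratic", rational_quadratic)]:
    K = kernel(x, x) + 1e-6 * np.eye(len(x))  # jitter for numerical stability
    L = np.linalg.cholesky(K)
    f = L @ rng.normal(size=(len(x), 3))      # three prior draws, one per column
    print(name, "draws shape:", f.shape)

Plotting the columns of f (e.g. with matplotlib) against x makes the effect of the covariance choice visible: the squared exponential yields very smooth functions, while the rational quadratic mixes length scales.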
 
We will determine exactly who will present which example at Friday's meeting. The presentations will start next week.
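
For example 13, here is a minimal sketch of a Gibbs sampler for Bayesian probit regression using the standard Albert and Chib (1993) data-augmentation scheme. The synthetic data, the prior variance tau2, and the iteration counts are illustrative assumptions.

import numpy as np
from scipy.stats import norm, truncnorm

rng = np.random.default_rng(0)

# Synthetic data: y_i = 1 with probability Phi(x_i^T beta_true).
n, d = 200, 2
X = np.column_stack([np.ones(n), rng.normal(size=n)])
beta_true = np.array([-0.5, 1.5])
y = (rng.uniform(size=n) < norm.cdf(X @ beta_true)).astype(int)

# Prior: beta ~ N(0, tau2 * I). Given z, the posterior covariance of beta is fixed.
tau2 = 10.0
V = np.linalg.inv(X.T @ X + np.eye(d) / tau2)
L = np.linalg.cholesky(V)

n_iter, burn_in = 2000, 500
beta = np.zeros(d)
samples = np.empty((n_iter, d))

for t in range(n_iter):
    # 1) Sample latents z_i | beta, y_i from a truncated normal with mean
    #    x_i^T beta and unit variance: z_i > 0 if y_i = 1, z_i <= 0 if y_i = 0.
    mu = X @ beta
    lo = np.where(y == 1, -mu, -np.inf)  # standardized lower bound
    hi = np.where(y == 1, np.inf, -mu)   # standardized upper bound
    z = mu + truncnorm.rvs(lo, hi, size=n, random_state=rng)
    # 2) Sample beta | z from N(V X^T z, V).
    beta = V @ (X.T @ z) + L @ rng.normal(size=d)
    samples[t] = beta

print("posterior mean:", samples[burn_in:].mean(axis=0))
print("true beta:     ", beta_true)

The key point is that, given the truncated-normal latent variables z, the conditional for beta is Gaussian with a covariance that never changes, so both Gibbs steps have closed forms and no tuning is required.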
 
2013-04-19 Added photos collected during the lectures.
2013-04-18 Uploaded the final version of slides for the seventh and eighth lectures. Uploaded the slide recordings for the remaining lectures. The video recordings of the lectures will be post-edited and later posted on this web page. I would like to thank our guest speakers for their wonderful talks and the insight into Bayesian machine learning they shared with us. Now that the guest lectures have finished, we have to agree on how to proceed further. I have prepared a doodle to determine the next suitable lecture date and time: removed. Please tell me when you are available.
2013-04-16 Uploaded the final version of slides for the sixth lecture.
2013-04-15 Uploaded the final version of slides for the fifth lecture.
2013-04-12 Uploaded a preliminary version of the last lecture on the topic of Dirichlet Processes.
2013-04-11 Please note that there is also an online course called Probabilistic Graphical Models available at https://www.coursera.org/course/pgm. This course can help you to understand some of the topics presented in this course. Peter Cerno, thanks for bringing this up.
2013-04-11 Uploaded the final version of slides for the fourth lecture. Today's talk is about Expectation Propagation. 
2013-04-10 Uploaded the final version of the third lecture slides and preliminary slides for lectures 5-7.
2013-04-09 Uploaded the final version of the second lecture slides. 
2013-04-08 The guest lectures have started. Uploaded the final version of the first lecture slides and the accompanying exercise.
2013-04-05 Added lecture topics and slides for the first four lectures. The first four lectures will be presented by Jose Miguel Hernandez-Lobato.
2013-04-04 The guest lectures start next week on Monday at 9:00am in the MS S1 lecture room.
2013-02-28 Added lecture dates and times.
2013-02-28 An introductory series of video lectures on Bayesian inference can be found at http://videolectures.net/mlss2012_lapalma/.
2013-02-19 I will organise an informal introduction for prospective students of this course this Friday (2013-02-22) at 2pm in the S1 lecture room. I will discuss the goals and content of the course and how it complements other courses taught at MFF. All are welcome.
2013-02-15 Even if you do not want to officially participate in the course, you are still welcome. Also, you can attend only the lectures presented by the experts from Cambridge if that suits you better. Just to make the allocation of lecture rooms easier, could you please respond to the following doodle: http://doodle.com/vu6ttns2ugt3fnk8 ?
2013-02-14 The course is listed in SIS as NPFL108 - Bayesian inference, so you can now sign up.
2013-02-07 Literature update. Added a link to the 4F13 Machine Learning course taught by C. Rasmussen and Z. Ghahramani at Cambridge.
2013-02-04 Syllabus update.
2013-02-04 Sara Wade will come instead of Richard Turner due to Richard's busy schedule.
2013-02-04 The start date of the lectures was set to the week beginning on 8.4.2013.

Bayesian inference

The course aims to provide students with a basic understanding of modern Bayesian inference methods. It will emphasize and discuss methods that have applications in robotics, natural language processing, data mining, and web search.

The course will be composed of a series of lectures presented by experts from the Machine Learning Group at Cambridge University, UK, led by Prof. Zoubin Ghahramani. The presenters will be José Miguel Hernández Lobato and Sara Wade. The guests will present 8 lectures over two weeks; the exact dates will be specified later. The lectures will begin in the week starting on 8.4.2013. After these two intensive weeks, there will be several practicals (laboratory exercises) which should help the students implement some of the presented methods. The course will be closed by an examination. Before the examination, students must submit the solved examples from the practicals.

The course will be presented in English and will be based on the machine learning course 4F13 taught by Carl Edward Rasmussen and Zoubin Ghahramani at the Cambridge University Engineering Department. The following link includes slides from this course as well as the practicals that the students are expected to do: http://mlg.eng.cam.ac.uk/teaching/4f13/1213/

Lecture dates and times

There will be 8 lectures. The lectures are planned for early morning so that they do not conflict with your other activities.
  1. 8.4.2013 room MS S1 from 9:00 till 10:30      An Introduction to Bayesian Machine Learning (Jose Miguel Hernandez-Lobato), slides, exercise
  2. 9.4.2013 room MS S1 from 9:00 till 10:30      Inference in Graphical Models (Jose Miguel Hernandez-Lobato), slides
  3. 10.4.2013 room MS S1 from 9:00 till 10:30     The Laplace Approximation and Variational Inference (Jose Miguel Hernandez-Lobato), slides
  4. 11.4.2013 room MS S1 from 9:00 till 10:30     Expectation Propagation (Jose Miguel Hernandez-Lobato), slides, slides recording
  5. 15.4.2013 room MS S1 from 9:00 till 10:30     Bayesian Regression and Gaussian Processes (Sara Wade), slides, slides recording
  6. 16.4.2013 room MS S1 from 9:00 till 10:30     Sampling Methods (Sara Wade), slides, slides recording
  7. 17.4.2013 room MS S1 from 9:00 till 10:30     Sampling Methods - Logit regression example, other examples (Sara Wade), slides, logit example, slides recording
  8. 18.4.2013 room MS S1 from 9:00 till 10:30     Dirichlet processes (Sara Wade), slides, slides recording

Video recordings at YouTube: http://www.youtube.com/playlist?list=PLrM7Z8xNORRdvGS6qEkbNmXavtutAEEeG

Slides recordings at YouTube: http://www.youtube.com/playlist?list=PLrM7Z8xNORRdiuN6TjWDAfh4g633XdWGl

Syllabus

Lecture topics:
  • Introduction to Bayesian machine learning and Bayesian networks.
  • Belief propagation and loopy belief propagation in Bayesian networks.
  • Variational Bayes and expectation propagation.
  • Sampling methods.
  • Gaussian processes.
  • Dirichlet processes.

License

The slides, video recordings, and photos are released under the Creative Commons Attribution-ShareAlike 3.0 License (CC-BY-SA 3.0).

Literature

The material covered in the lectures can be found in recent textbooks:
[1] D. Koller, N. Friedman: Probabilistic Graphical Models: Principles and Techniques (Adaptive Computation and Machine Learning series), The MIT Press, 2009, p. 1280.
[2] C. M. Bishop: Pattern Recognition and Machine Learning, Springer, 2006, p. 738.
[3] K. Murphy: Machine Learning: A Probabilistic Perspective, The MIT Press, 2012.
[4] D. Barber: Bayesian Reasoning and Machine Learning, Cambridge University Press, 2012, freely available on the web.
[5] D. MacKay: Information Theory, Inference, and Learning Algorithms, Cambridge University Press, 2003, freely available at http://www.inference.phy.cam.ac.uk/mackay/itila/. Video lectures are also available there.
[6] C. Rasmussen, Z. Ghahramani: 4F13 Machine Learning, taught at the Cambridge University Engineering Department. Slides and practicals: http://mlg.eng.cam.ac.uk/teaching/4f13/1213/