SIS code: 
0/2 C

Compendium of Neural Machine Translation

Deep learning is finding its way everywhere, including machine translation. Neural machine translation has recently become a new, interesting, and successful paradigm, outperforming everything that existed before. The famous Google Translate recently discarded its old system and deployed a new one based on neural networks, and neural systems dominated last year's machine translation competition in Berlin.

The new paradigm brings new theoretical concepts and new ways of looking at the classic problems of machine translation. In this seminar, you will become familiar with the theoretical framework of neural machine translation in enough depth to study the most recent academic papers on the topic.

In the first six seminars, we will give lectures on the following topics:

  • recurrent neural networks and the intuition behind them
  • sequence-to-sequence learning
  • attention models and their variants
  • how to deal with a limited vocabulary
  • advanced training methods such as reinforcement learning
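
To give a first taste of one of the topics above, here is a minimal sketch of dot-product attention in plain NumPy. The function names and dimensions are illustrative only, not taken from any course material; real NMT systems learn these vectors and typically add trainable projections.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax: subtract the max before exponentiating.
    e = np.exp(x - x.max())
    return e / e.sum()

def attention(decoder_state, encoder_states):
    # Score each encoder state by its dot product with the decoder state,
    # normalize the scores into a distribution, and take a weighted sum.
    scores = encoder_states @ decoder_state   # one score per source position
    weights = softmax(scores)                 # attention distribution over the source
    context = weights @ encoder_states        # context vector fed to the decoder
    return context, weights

# Illustrative random data: 5 source positions, hidden size 8.
rng = np.random.default_rng(0)
encoder_states = rng.normal(size=(5, 8))
decoder_state = rng.normal(size=8)

context, weights = attention(decoder_state, encoder_states)
print(weights)  # the weights form a probability distribution over source words
```

The weighted sum lets the decoder "look back" at different source positions at each step, which is the core idea the attention seminar will develop in detail.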

In the rest of the semester, you will present selected papers from recent research conferences and academic journals on the topic and discuss how these could be implemented. By the end of the semester, you should be up to date with the most recent developments in neural machine translation.

... if you can't wait for the beginning of the semester, you can read what the New York Times wrote about it in December.