This seminar should familiarize the students with current research trends in machine translation using deep neural networks. Most importantly, the students should learn how to deal with the ever-growing body of literature on empirical research in machine translation and critically assess its content. The semester consists of a few lectures summarizing the state of the art, discussions on reading assignments, and student presentations of selected papers.
The course is held on Wednesdays at 14:00 in S1. The first lecture is on February 26.
Reading (1.5 hours)
LeCun, Yann, Yoshua Bengio, and Geoffrey Hinton. Deep learning. Nature 521.7553 (2015): 436.
Mar 4 Sequence-to-Sequence
Covered topics: embeddings, RNNs, vanishing gradient, LSTM, encoder-decoder, attention
Reading (2 hours)
Vaswani, Ashish, et al. Attention is all you need. Advances in Neural Information Processing Systems. 2017.
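For students who want a hands-on reference while doing the reading, here is a minimal sketch of the scaled dot-product attention at the core of the Vaswani et al. paper (and of the attention topic listed above). It is an illustrative example only, not part of the course materials; the function names, tensor shapes, and toy inputs are our own assumptions.

# Minimal sketch of scaled dot-product attention (Vaswani et al., 2017),
# using NumPy only. Names, shapes, and the toy data are illustrative.
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V.

    Q: (num_queries, d_k), K: (num_keys, d_k), V: (num_keys, d_v)
    Returns context vectors of shape (num_queries, d_v) and the attention weights.
    """
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)     # similarity of each query to each key
    weights = softmax(scores, axis=-1)  # each row is a distribution over keys
    return weights @ V, weights

# Toy usage: 3 decoder queries attending over 5 encoder states of width 8.
rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 8))
K = rng.normal(size=(5, 8))
V = rng.normal(size=(5, 8))
context, weights = scaled_dot_product_attention(Q, K, V)
print(context.shape, weights.shape)  # (3, 8) (3, 5)

Each query position receives a context vector that is a weighted average of the value vectors; the multi-head attention discussed in the paper applies this same operation in parallel over several learned projections of Q, K, and V.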
There will be a reading assignment after every class. You will be given a few questions about the reading, and your answers should be submitted before the next lecture.
Students will present one of the selected groups of papers to their fellow students. The presenting students will also prepare questions for the discussion that follows the presentation.
The other students should also familiarize themselves with the paper so that they can participate in the discussion.
Presenting students are strongly encouraged to arrange a consultation with the course instructors at least one day before their presentation.
There will be a final written test that will not be graded.