NPFL116 – Compendium of Neural Machine Translation

This seminar familiarizes students with current research trends in machine translation using deep neural networks. Most importantly, students should learn how to deal with the ever-growing body of literature on empirical research in machine translation and critically assess its content. The semester consists of a few lectures summarizing the state of the art, discussions of reading assignments, and student presentations of selected papers.

About

SIS code: NPFL116
Semester: summer
E-credits: 3
Examination: 0/2 C
Instructors: Jindřich Helcl, Jindřich Libovický

Timespace Coordinates

The course is not taught this semester. We look forward to seeing you next year!

Lectures

1. Introductory notes on machine translation and deep learning

2. Sequence-to-sequence learning using Recurrent Neural Networks

License

Unless otherwise stated, teaching materials for this course are available under CC BY-SA 4.0.

1. Introductory notes on machine translation and deep learning

Feb 24 · Slides: Logistics, RNNs

Covered topics: what is MT, deep learning, embeddings, RNNs, the vanishing gradient problem, LSTMs
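
To make the last two items concrete, below is a minimal NumPy sketch of a single LSTM step (the function name lstm_step and the stacked parameter layout are illustrative choices, not from the course materials). The additive cell-state update is the part usually credited with mitigating vanishing gradients.

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def lstm_step(x, h_prev, c_prev, W, U, b):
        """One LSTM step; rows of W, U and b stack the input, forget, output and candidate parts."""
        d = h_prev.shape[0]
        z = W @ x + U @ h_prev + b      # all four pre-activations at once, shape (4*d,)
        i = sigmoid(z[0:d])             # input gate
        f = sigmoid(z[d:2 * d])         # forget gate
        o = sigmoid(z[2 * d:3 * d])     # output gate
        g = np.tanh(z[3 * d:4 * d])     # candidate cell value
        c = f * c_prev + i * g          # additive update: gradients flow through f rather
                                        # than through repeated matrix multiplications
        h = o * np.tanh(c)              # hidden state passed on to the next time step
        return h, c

    # Tiny usage example: hidden size 3, input size 2, a sequence of 5 random vectors.
    rng = np.random.default_rng(0)
    d_in, d = 2, 3
    W = rng.normal(size=(4 * d, d_in))
    U = rng.normal(size=(4 * d, d))
    b = np.zeros(4 * d)
    h, c = np.zeros(d), np.zeros(d)
    for x in rng.normal(size=(5, d_in)):
        h, c = lstm_step(x, h, c, W, U, b)
    print(h)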

Reading (approx. 1 hour): Tao Lei et al., Simple Recurrent Units for Highly Parallelizable Recurrence. EMNLP 2018.

Questions

  • Identify the information highway in simple recurrent units.
  • Given that all input embeddings are known, what part of the SRU computation can be fully parallelized and what part of the computation is sequential?
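
The second question becomes easier to think about with the computation spelled out. Below is a minimal NumPy sketch of a simplified SRU layer (the function name sru_layer is illustrative, and the sketch omits details of the paper's full formulation, such as how the gates may use the previous cell state); note which computations depend only on the inputs and which depend on the previous time step.

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def sru_layer(X, W, W_f, W_r, b_f, b_r):
        """Run a simplified SRU over a sequence X of shape (T, d)."""
        # These projections depend only on the inputs, so they can be computed
        # for all time steps at once with large matrix multiplications.
        U = X @ W.T                     # candidate values, shape (T, d)
        F = sigmoid(X @ W_f.T + b_f)    # forget gates, shape (T, d)
        R = sigmoid(X @ W_r.T + b_r)    # reset gates, shape (T, d)

        # Only this cheap element-wise recurrence over the cell state is sequential.
        T, d = X.shape
        c = np.zeros(d)
        H = np.zeros((T, d))
        for t in range(T):
            c = F[t] * c + (1.0 - F[t]) * U[t]
            H[t] = R[t] * c + (1.0 - R[t]) * X[t]   # highway connection to the input
        return H

    # Tiny usage example with random weights (hidden size equals input size here).
    rng = np.random.default_rng(0)
    T, d = 6, 4
    X = rng.normal(size=(T, d))
    W = rng.normal(size=(d, d))
    W_f = rng.normal(size=(d, d))
    W_r = rng.normal(size=(d, d))
    print(sru_layer(X, W, W_f, W_r, np.zeros(d), np.zeros(d)).shape)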

2. Sequence-to-sequence learning using Recurrent Neural Networks

Mar 3 · Slides: Encoder-Decoder architecture

Covered topics: sequence-to-sequence learning with RNNs, attention
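
As a pointer to what the attention mechanism computes, here is a minimal NumPy sketch of dot-product attention over encoder states (the function name attention and the dot-product scoring are illustrative; the lecture may instead use additive, Bahdanau-style scoring).

    import numpy as np

    def attention(decoder_state, encoder_states):
        """Weight encoder states by their similarity to the current decoder state.

        encoder_states: (T, d) matrix of encoder hidden states
        decoder_state:  (d,) current decoder hidden state
        Returns the context vector (d,) and the attention distribution (T,).
        """
        scores = encoder_states @ decoder_state        # (T,) similarity scores
        scores -= scores.max()                         # numerical stability
        alpha = np.exp(scores) / np.exp(scores).sum()  # softmax over source positions
        context = alpha @ encoder_states               # weighted sum of encoder states
        return context, alpha

    # Tiny usage example: 5 source positions, hidden size 8.
    rng = np.random.default_rng(0)
    enc = rng.normal(size=(5, 8))
    dec = rng.normal(size=8)
    context, alpha = attention(dec, enc)
    print(alpha.round(2), context.shape)

The distribution alpha is recomputed at every decoder step, so the context vector changes as the translation is generated.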

Reading (approx. 1 hour): Kasai et al., Deep Encoder, Shallow Decoder: Reevaluating Non-autoregressive Machine Translation. ICLR 2021.

Questions

  • The motivation for using a shallow decoder is to generate output sentences efficiently. Can you think of a way of generating output that is more efficient than using a shallow Transformer decoder?

Reading assignments

There will be a reading assignment after every class. You will be given a few questions about the reading; your answers should be submitted before the next lecture.

Student presentations

Students will present one of the selected groups of papers to their fellow students. The presenting students will also prepare questions for discussion after the paper presentation.

The other students should also familiarize themselves with the paper so that they can participate in the discussion.

Presenting students are strongly encouraged to arrange a consultation with the course instructors at least one day before the presentation.

Final written test

There will be a final written test that will not be graded.