Deep Learning Seminar, Winter 2019/20

In recent years, deep neural networks have been used to solve complex machine-learning problems and have achieved state-of-the-art results in many areas. The field of deep learning is developing rapidly, with new methods and techniques emerging steadily.

The goal of the seminar is to follow the newest advancements in the deep learning field. The course takes the form of a reading group – each week, one of the students presents a paper. The paper is announced in advance, so all participants can read it beforehand and take part in the discussion.

If you want to receive announcements about the chosen papers, sign up to our mailing list ufal-rg@googlegroups.com.

About

SIS code: NPFL117
Semester: winter + summer
E-credits: 3
Examination: 0/2 C
Guarantor: Milan Straka

Timespace Coordinates

The Deep Learning Seminar takes place on Monday at 12:20 in S10. We will first meet on Monday Oct 07.

Requirements

To pass the course, you need to present a research paper and attend the presentations regularly.


To add your name to a paper in the table below, edit the source code on GitHub and send a PR.

Date         Who              Topic             Paper(s)

07 Oct 2019  Milan Straka     Transformer
  Ashish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Lukasz Kaiser, Illia Polosukhin: Attention Is All You Need
  Peter Shaw, Jakob Uszkoreit, Ashish Vaswani: Self-Attention with Relative Position Representations
  Cheng-Zhi Anna Huang, Ashish Vaswani, Jakob Uszkoreit, Noam Shazeer, Ian Simon, Curtis Hawthorne, Andrew M. Dai, Matthew D. Hoffman, Monica Dinculescu, Douglas Eck: Music Transformer
  Zihang Dai, Zhilin Yang, Yiming Yang, Jaime Carbonell, Quoc V. Le, Ruslan Salakhutdinov: Transformer-XL: Attentive Language Models Beyond a Fixed-Length Context

14 Oct 2019  Ondřej Měkota    Transformer
  Jacob Devlin, Ming-Wei Chang, Kenton Lee, Kristina Toutanova: BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
  Zhilin Yang, Zihang Dai, Yiming Yang, Jaime Carbonell, Ruslan Salakhutdinov, Quoc V. Le: XLNet: Generalized Autoregressive Pretraining for Language Understanding

21 Oct 2019  Tomáš Souček     3D Pointclouds
  Charles R. Qi, Li Yi, Hao Su, Leonidas J. Guibas: PointNet++: Deep Hierarchical Feature Learning on Point Sets in a Metric Space
  Christopher Choy, JunYoung Gwak, Silvio Savarese: 4D Spatio-Temporal ConvNets: Minkowski Convolutional Neural Networks
  Christopher Choy, Jaesik Park, Vladlen Koltun: Fully Convolutional Geometric Features

28 Oct 2019  No DL seminar    (Czech Independence Day)

04 Nov 2019  Zdeněk Kasner    Neural LMs
  Alec Radford, Jeffrey Wu, Rewon Child, David Luan, Dario Amodei, Ilya Sutskever: Language Models are Unsupervised Multitask Learners (OpenAI blog post)
  Sandeep Subramanian, Raymond Li, Jonathan Pilault, Christopher Pal: On Extractive and Abstractive Neural Document Summarization with Transformer Language Models

11 Nov 2019
18 Nov 2019
25 Nov 2019
02 Dec 2019
09 Dec 2019
16 Dec 2019

23 Dec 2019  No DL seminar    (Christmas Holiday)
30 Dec 2019  No DL seminar    (Christmas Holiday)

06 Jan 2020

You can choose any paper you find interesting, but if you would like some inspiration, you can look at the following list. The papers are grouped; each group is expected to be presented in a single seminar.

Natural Language Processing

Generative Modeling

Neural Architecture Search (AutoML)

Networks with External Memory

Optimization

Adversarial Examples