Monday, 29 November, 2021 - 14:00
Room: 

Adapters in Transformers. A New Paradigm for Transfer Learning…?

Jonas Pfeiffer (Technical University of Darmstadt)

Adapters have recently been introduced as an alternative transfer learning strategy. Instead of fine-tuning all weights of a pre-trained transformer-based model, small neural network components are introduced at every layer. The pre-trained parameters are frozen and only the newly introduced adapter weights are fine-tuned, encapsulating the downstream task information in designated parts of the model. In this talk we will provide an introduction to adapter training in natural language processing. We will go into detail on how the encapsulated knowledge can be leveraged for compositional transfer learning as well as cross-lingual transfer. We will also briefly touch on the efficiency of adapters in terms of trainable parameters and (wall-clock) training time. Finally, we will provide an outlook on recent alternative adapter approaches and training strategies.
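
As a rough illustration of the idea described in the abstract, the sketch below shows a bottleneck adapter in PyTorch together with a helper that freezes all pre-trained weights so that only adapter parameters remain trainable. This is a minimal example, not the speaker's implementation; the hidden size, reduction factor, and the name-based freezing convention are illustrative assumptions.

```python
# Minimal sketch of a bottleneck adapter: down-projection, non-linearity,
# up-projection, and a residual connection around the whole block.
import torch
import torch.nn as nn


class Adapter(nn.Module):
    """Small adapter module inserted after a transformer sub-layer."""

    def __init__(self, hidden_size: int = 768, reduction_factor: int = 16):
        super().__init__()
        bottleneck = hidden_size // reduction_factor
        self.down = nn.Linear(hidden_size, bottleneck)  # project down
        self.up = nn.Linear(bottleneck, hidden_size)    # project back up
        self.act = nn.ReLU()

    def forward(self, hidden_states: torch.Tensor) -> torch.Tensor:
        # The residual connection keeps the pre-trained representation
        # largely intact; the adapter only learns a small task-specific shift.
        return hidden_states + self.up(self.act(self.down(hidden_states)))


def freeze_all_but_adapters(model: nn.Module) -> None:
    """Freeze pre-trained weights; train only parameters whose name
    contains 'adapter' (naming convention assumed for this sketch)."""
    for name, param in model.named_parameters():
        param.requires_grad = "adapter" in name
```

With this setup, only a small fraction of the model's parameters receives gradient updates, which is where the parameter and training-time efficiency discussed in the talk comes from.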

CV: 

Jonas is a third-year PhD student at the Ubiquitous Knowledge Processing Lab at the Technical University of Darmstadt. He is interested in modular and compositional representation learning in multi-task, multilingual, and multi-modal contexts. Jonas received the IBM PhD Research Fellowship in 2020. He has given invited talks in academia (e.g. University of Cambridge, University of Colorado Boulder), in industry (e.g. Facebook AI Research, IBM Research), and at machine learning summer/winter schools (e.g. Lisbon Machine Learning Summer School (LxMLS) 2021, Advanced Language Processing Winter School (ALPS) 2022).

 

***The talk will be streamed via Zoom. For details on how to join the Zoom meeting, please write to sevcikova et ufal.mff.cuni.cz***