Session Details

ML Summit
The big training event for Machine Learning
October 11 - 13, 2021 | Online April 2022 | Munich

Christoph Henkelmann


Register by February 18 and save up to €200 per ticket! Register now

Efficient Transformers

Transformers are the new go-to technology for Natural Language Processing (NLP) and are also starting to gain traction in the computer vision community. However, despite all their successes and widespread adoption, they have one major drawback: their computation and memory requirements grow quadratically with the input sequence length. Training transformer models from scratch is therefore a very resource-intensive task.
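To make the quadratic cost concrete, here is a minimal NumPy sketch of vanilla scaled dot-product attention (not code from the session): the score matrix `Q @ K.T` has shape (n, n), which is where both the O(n²) compute and O(n²) memory come from.

```python
import numpy as np

def attention(Q, K, V):
    """Vanilla scaled dot-product attention.

    The score matrix Q @ K.T has shape (n, n), so both compute and
    memory grow quadratically with the sequence length n.
    """
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)  # (n, n): the quadratic bottleneck
    # numerically stable softmax over each row
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V  # (n, d)

rng = np.random.default_rng(0)
n, d = 512, 64  # illustrative sizes
Q, K, V = rng.standard_normal((3, n, d))
out = attention(Q, K, V)
print(out.shape)  # (512, 64)
```

Doubling n quadruples the size of `scores`, which is exactly the scaling problem the efficient-transformer literature attacks.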

In this session we take a look at the current state of research into efficient transformer layers, i.e. reformulations of the vanilla transformer that reduce computation and/or memory requirements to O(n·log n) or even O(n). If your knowledge of transformers or complexity theory is a bit rusty, do not worry: the session starts with a short refresher on both topics so you can make the most of it.
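As one illustration of how an O(n) reformulation can work (a sketch in the spirit of kernelized linear attention, e.g. Katharopoulos et al., not necessarily the specific methods covered in the session): if the softmax is replaced by a feature map φ, associativity lets us compute the small (d × d) summary φ(K)ᵀV once instead of the (n × n) score matrix.

```python
import numpy as np

def linear_attention(Q, K, V):
    """Kernelized linear attention sketch.

    Rewriting softmax(QK^T)V as phi(Q) (phi(K)^T V) / normalizer means
    we only ever form a (d, d) summary, so cost is O(n * d^2):
    linear in the sequence length n.
    """
    # simple positive feature map (an illustrative choice, not canonical)
    phi = lambda x: np.maximum(x, 0.0) + 1e-6
    Qp, Kp = phi(Q), phi(K)        # (n, d) each
    KV = Kp.T @ V                  # (d, d): independent of n
    Z = Qp @ Kp.sum(axis=0)        # (n,): per-row normalizer
    return (Qp @ KV) / Z[:, None]  # (n, d)

rng = np.random.default_rng(0)
n, d = 1024, 64  # illustrative sizes
Q, K, V = rng.standard_normal((3, n, d))
out = linear_attention(Q, K, V)
print(out.shape)  # (1024, 64)
```

Because the attention weights still sum to one per row, feeding in a constant V returns that constant, which is a handy sanity check; the trade-off versus vanilla attention is that φ only approximates the softmax kernel.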

Session Tracks

#ML Conference