
Abstract: In this session, I'll start by introducing the recent breakthroughs in NLP that resulted from combining Transfer Learning with Transformer architectures. Then, we'll learn to use the open-source tools released by Hugging Face, such as the Transformers and Tokenizers libraries and the distilled models.
Learning outcomes: understanding Transfer Learning in NLP, how the Transformers and Tokenizers libraries are organized, and how to use them for downstream tasks such as text classification, NER, and text generation.
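As a taste of what the session covers, here is a minimal sketch using the Transformers pipeline API for the three downstream tasks mentioned above. It assumes the transformers library is installed (pip install transformers); the example inputs and the choice of distilgpt2 as a distilled model are illustrative, not taken from the session materials.

```python
from transformers import pipeline

# Text classification: a default pretrained sentiment-analysis
# model is downloaded on first use
classifier = pipeline("sentiment-analysis")
print(classifier("Transfer learning has made state-of-the-art NLP much more accessible."))

# Named entity recognition with the default pretrained NER model
ner = pipeline("ner")
print(ner("Thomas Wolf works at Hugging Face in Brooklyn."))

# Text generation, here with the distilled model distilgpt2
# (an illustrative choice, not the session's required model)
generator = pipeline("text-generation", model="distilgpt2")
print(generator("Transfer learning in NLP", max_length=30))
```

Each pipeline bundles a tokenizer and a pretrained model behind a single call, which is the organizing idea of the libraries the session walks through.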
Bio: Thomas leads the Science Team at Hugging Face Inc., a Brooklyn-based startup working on Natural Language Generation and Natural Language Understanding.
After graduating from École Polytechnique (Paris, France), he worked on laser-plasma interactions at the BELLA Center of the Lawrence Berkeley National Laboratory (Berkeley, CA). He was accepted for a PhD at MIT (Cambridge, MA) but ended up doing his PhD in Statistical/Quantum Physics at Sorbonne University and ESPCI (Paris, France), working on superconducting materials for the French DARPA (DGA) and Thales.
Thomas is interested in Natural Language Processing, Deep Learning, and Computational Linguistics. His research focuses mostly on Natural Language Generation, with Natural Language Understanding serving as a tool for better generation.

Thomas Wolf, PhD
Chief Science Officer | Hugging Face 🤗
