
Abstract: In this workshop, I will talk about transformers and their inner workings. I will review two Google papers:
- Presentation: Attention Is All You Need.
- Notebook: Create a Translator from Scratch.
- Presentation: BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding.
- Notebook: How to use BERT for a task like Question Answering or Classification.
Session Outline
The workshop covers transformers and their inner workings, reviewing the two Google papers across two modules:
- Module 1: What are transformers?
Presentation: Attention Is All You Need.
Notebook: Create a Translator from Scratch (a minimal attention sketch follows this outline).
- Module 2: What is BERT?
Presentation: BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding.
Notebook: How to use BERT for a task like Question Answering or Classification (a short usage sketch also follows).
I may refine this structure further before the session.
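To give a flavour of Module 1, here is a minimal sketch of scaled dot-product attention, the core operation from "Attention Is All You Need". The shapes and variable names are illustrative only and are not taken from the workshop notebook.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Q, K, V: arrays of shape (seq_len, d_k). Returns (seq_len, d_k)."""
    d_k = Q.shape[-1]
    # Similarity between every query and every key, scaled by sqrt(d_k)
    # so the softmax stays in a well-behaved range.
    scores = Q @ K.T / np.sqrt(d_k)
    # Softmax over the key dimension gives the attention weights.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output position is a weighted sum of the value vectors.
    return weights @ V

# Toy example: 4 tokens, 8-dimensional representations.
rng = np.random.default_rng(0)
Q = rng.normal(size=(4, 8))
K = rng.normal(size=(4, 8))
V = rng.normal(size=(4, 8))
print(scaled_dot_product_attention(Q, K, V).shape)  # (4, 8)
```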
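For Module 2, a minimal sketch of using a pre-trained BERT-family model for classification, assuming the Hugging Face `transformers` library and a publicly available fine-tuned checkpoint; the workshop notebook may use a different model or setup.

```python
from transformers import pipeline

# Sentiment classification with a BERT-family model fine-tuned on SST-2.
# The model name below is an assumption for illustration.
classifier = pipeline(
    "text-classification",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)
print(classifier("Transformers make sequence modelling much easier."))
# e.g. [{'label': 'POSITIVE', 'score': 0.99...}]
```

The same `pipeline` interface also supports a "question-answering" task, which is the other use case covered in the notebook.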
Background Knowledge
Basic NLP
Deep Learning
LSTMs
Bio: Rahul Agarwal is currently working as a Machine Learning Engineer at Facebook. Before this, he worked at WalmartLabs and Citi India.
In his free time, Rahul likes to write about Data Science and Machine Learning. You can find him on Medium, his own website, or his LinkedIn profile, where he is most active.