Abstract: Over the last 18 months or so, Deep Learning techniques in NLP have undergone a dramatic transformation, thanks to new Transformer-style network architectures. This has made Conversational AI more accessible to Data Scientists than ever before. This talk will be a quick tour of this rapidly evolving landscape.
It will cover the evolution of Neural Network based approaches to NLP, starting with word embeddings, RNNs, and attention mechanisms, and finally the Transformer network, which has brought about what's considered the "ImageNet moment of NLP".
It'll explore the different Transformer-based network architectures, including BERT, XLNet, and GPT-2. It'll also cover the factors driving the trend toward ever larger pre-trained language models built on ever larger datasets, and the tools and techniques used to compress and distill these models for efficient inference.
Bio: Raghav Mani manages Developer Relations at NVIDIA, with a focus on NLP, Speech and Data Science in Healthcare.
Prior to NVIDIA, Raghav worked at Epic, leading various product & engineering teams, including the one behind Epic's patient engagement platform, MyChart.
Most recently, he led a team of data scientists and engineers in building Epic’s ML platform on Azure.
He's a strong advocate for the use of Deep Learning & NLP techniques in Healthcare.
He holds a Bachelor's degree from the Indian Institute of Technology Madras and a Master's from Texas A&M University, College Station.