Abstract: Specializing in domains like computer vision and natural language processing is no longer a luxury but a necessity for any data scientist in today’s fast-paced world! With a hands-on, interactive approach, we will cover essential concepts in deep transfer learning for NLP, with a focus on Transformers and Large Language Models, and we will compare and contrast transformers like BERT with LLMs like ChatGPT. Finally, through hands-on tutorials, we will showcase how to solve popular NLP tasks, including NER, classification, search / information retrieval, summarization, language translation, and Q&A systems, using models like BERT and ChatGPT with popular libraries like HuggingFace and OpenAI in the Python programming language.
Bio: Dipanjan (DJ) Sarkar is an accomplished Data Scientist, published Author and Consultant with over nine years of industry experience in all things data. He was recognized as a Google Developer Expert in Machine Learning by Google in 2019, and as a Champion Innovator in Cloud AI/ML by Google in 2022. He currently works as a Lead Data Scientist at Constructor Learning (formerly Schaffhausen Institute of Technology (SIT) Learning), Zurich.
Dipanjan has led advanced analytics initiatives working with Fortune 500 companies like Intel, Applied Materials and Red Hat / IBM. He works on leveraging data science, machine learning and deep learning to build large-scale intelligent systems. In his spare time, Dipanjan also works as an independent consultant, mentor and AI advisor, collaborating with multiple universities, organizations and startups across the globe. His passions include solving challenging data problems as well as educating and helping people upskill in all things data. Find more about him at https://djsarkar.com