Abstract: Large Language Models like GPT-4 are transforming the world in general, and the field of data science in particular, at an unprecedented pace. This training introduces deep learning transformer architectures, including LLMs. Critically, it also demonstrates the breadth of capabilities that state-of-the-art LLMs like GPT-4 can deliver, including how they can dramatically accelerate the development of machine learning models and commercially successful data-driven products, expanding the creative capacities of data scientists and pushing them in the direction of becoming data product managers. Brought to life via hands-on code demos that leverage the Hugging Face and PyTorch Lightning Python libraries, this training covers the full lifecycle of LLM development, from training to production deployment.
Module 1: Introduction to Large Language Models
- Transformer Architectures
Module 2: The Breadth of LLM Capabilities
- OpenAI APIs, including GPT-4
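As a taste of the API work covered in Module 2, the sketch below builds a request body following the OpenAI chat completions schema. The prompt text and parameter values are illustrative placeholders, not examples from the training itself; a real call would additionally require an API key and an HTTP client or the `openai` library.

```python
import json

# Hypothetical prompt; the payload fields follow the OpenAI chat completions schema.
payload = {
    "model": "gpt-4",
    "messages": [
        {"role": "system", "content": "You are a helpful data science assistant."},
        {"role": "user", "content": "Summarize what a transformer is in one sentence."},
    ],
    "temperature": 0.7,  # sampling temperature; lower values make output more deterministic
}

# Serialize to JSON, as it would be sent in the body of a POST request
body = json.dumps(payload)
```

In practice, this body would be POSTed to the chat completions endpoint with an `Authorization: Bearer <key>` header, and the model's reply would arrive in the response's `choices` field.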
Module 3: Training and Deploying LLMs
- Hugging Face models
- Training with PyTorch Lightning
- Streaming data sets
- Deployment considerations
- Parameter-efficient fine-tuning (PEFT) with low-rank adaptation (LoRA)
- Single-GPU models: LLaMA, Alpaca, GPT4All, Vicuna, and Dolly 2.0
- Multiple GPUs
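To illustrate the idea behind the parameter-efficient fine-tuning topic above, here is a minimal NumPy sketch of the LoRA update: the pretrained weight matrix stays frozen while two small low-rank factors are trained. All dimensions and the scaling constant are hypothetical, and a real implementation would use the Hugging Face `peft` library on PyTorch tensors rather than NumPy.

```python
import numpy as np

rng = np.random.default_rng(42)

d_in, d_out, r, alpha = 64, 64, 4, 8  # hypothetical layer sizes, rank, and scaling

W = rng.standard_normal((d_in, d_out))     # frozen pretrained weight (not trained)
A = rng.standard_normal((d_in, r)) * 0.01  # low-rank factor A (trainable)
B = np.zeros((r, d_out))                   # low-rank factor B, zero-initialized so
                                           # the adapted layer starts identical to the base

def lora_forward(x):
    # Base projection plus the scaled low-rank update: x W + (alpha/r) x A B
    return x @ W + (alpha / r) * (x @ A @ B)

x = rng.standard_normal((2, d_in))
y = lora_forward(x)

# Only A and B are trained, a small fraction of the full weight's parameters
full_params = W.size
lora_params = A.size + B.size
```

Because `B` starts at zero, the adapted layer initially reproduces the frozen base exactly, and training only updates the `A`/`B` factors: here 512 parameters instead of 4,096.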
Module 4: Getting Commercial Value from LLMs
- Tasks that can be Automated
- Tasks that can be Augmented
- Guidance for Successful A.I. Teams and Projects
Parts of this training will be accessible to anyone who would like to understand how to develop commercially successful data products in the new paradigm unleashed by LLMs like GPT-4. To make the most of this training, attendees should be proficient in deep learning and Python programming.
Bio: Jon Krohn is Co-Founder and Chief Data Scientist at the machine learning company Nebula. He authored the book Deep Learning Illustrated, an instant #1 bestseller that was translated into seven languages. He is also the host of SuperDataScience, the data science industry’s most listened-to podcast. Jon is renowned for his compelling lectures, which he offers at leading universities and conferences, as well as via his award-winning YouTube channel. He holds a PhD from Oxford and has been publishing on machine learning in prominent academic journals since 2010.