
Abstract: In this workshop, you'll walk through a complete end-to-end example of using Hugging Face Transformers, involving both our open-source libraries and some of our commercial products. Starting from a dataset of real-life product reviews from Amazon.com, you'll train and deploy a text classification model that predicts the star rating of similar reviews.
Session Outline:
Along the way, you'll learn how to:
- Explore models and datasets on the Hugging Face Hub,
- Load, prepare and save datasets with the Hugging Face datasets library,
- Load, train and save models with the Hugging Face transformers library,
- Build ML applications with Hugging Face Spaces to showcase your models,
- Use hardware acceleration with the Hugging Face Optimum library to optimize training and prediction times,
- and maybe a few more things, if we have time!
Of course, all code will be shared with you, and you'll be able to reuse it easily in your own projects; the short sketches below give a flavor of what that code looks like.
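To give a concrete idea of the datasets and transformers steps in the outline, here is a minimal sketch of loading a reviews dataset and fine-tuning a star-rating classifier with the Trainer API. The dataset identifier, column names, and base checkpoint are illustrative placeholders, not necessarily the exact ones used in the workshop.

```python
# Minimal sketch: load a product-review dataset and fine-tune a 5-class
# star-rating classifier. Dataset id, column names, and checkpoint are
# illustrative placeholders, not necessarily what the workshop uses.
from datasets import load_dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

# Load an Amazon-reviews dataset from the Hugging Face Hub (placeholder id).
dataset = load_dataset("amazon_reviews_multi", "en")

checkpoint = "distilbert-base-uncased"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)

def tokenize(batch):
    # Assumes the review text lives in a "review_body" column.
    return tokenizer(batch["review_body"], truncation=True)

dataset = dataset.map(tokenize, batched=True)
# Star ratings are 1-5; the Trainer expects 0-indexed integer labels.
dataset = dataset.map(lambda example: {"labels": example["stars"] - 1})

model = AutoModelForSequenceClassification.from_pretrained(checkpoint, num_labels=5)
args = TrainingArguments(
    output_dir="star-rating-model",
    num_train_epochs=1,
    per_device_train_batch_size=16,
)
trainer = Trainer(
    model=model,
    args=args,
    train_dataset=dataset["train"],
    eval_dataset=dataset["validation"],
    tokenizer=tokenizer,  # enables dynamic padding via the default data collator
)
trainer.train()
trainer.save_model("star-rating-model")
```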
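For the Spaces item, a Space is essentially a small app (for example, a Gradio app) hosted on the Hub. A minimal sketch, assuming the placeholder model directory produced above, could look like this:

```python
# Minimal sketch of a Gradio demo suitable for a Hugging Face Space.
# "star-rating-model" is a placeholder for the model trained above (or its Hub id).
import gradio as gr
from transformers import pipeline

classifier = pipeline("text-classification", model="star-rating-model")

def predict(review: str) -> str:
    # Return the top predicted label and its confidence score.
    result = classifier(review)[0]
    return f"{result['label']} (score: {result['score']:.2f})"

demo = gr.Interface(
    fn=predict,
    inputs=gr.Textbox(lines=5, label="Product review"),
    outputs=gr.Textbox(label="Predicted star rating"),
)

demo.launch()
```

Pushing a script like this (plus a requirements.txt) to a Space on the Hub is enough to get a shareable demo.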
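Finally, for the Optimum item, one common acceleration path is exporting the trained model to ONNX and running it with ONNX Runtime; training acceleration works differently and depends on the target hardware. A minimal inference-side sketch, again using the placeholder model directory:

```python
# Minimal sketch: accelerate inference by exporting the model to ONNX and
# running it with ONNX Runtime via Optimum. "star-rating-model" is a placeholder.
from optimum.onnxruntime import ORTModelForSequenceClassification
from transformers import AutoTokenizer, pipeline

# export=True converts the PyTorch checkpoint to ONNX on the fly.
ort_model = ORTModelForSequenceClassification.from_pretrained(
    "star-rating-model", export=True
)
tokenizer = AutoTokenizer.from_pretrained("star-rating-model")

onnx_classifier = pipeline("text-classification", model=ort_model, tokenizer=tokenizer)
print(onnx_classifier("Great product, works exactly as described!"))
```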
Background Knowledge:
Participants don't need to be ML experts, but they must be familiar with basic ML concepts and workflows, as well as Python and Python-based tools for ML (Jupyter, numpy, pandas, etc.).
Bio: Julien is currently Chief Evangelist at Hugging Face. He previously spent six years at Amazon Web Services, where he was the Global Technical Evangelist for AI & Machine Learning. Prior to joining AWS, Julien served for 10 years as CTO/VP of Engineering at large-scale startups.