Abstract: It's easy to lose track of which changes gave you the best result when you start exploring multiple model architectures. Tracking changes to your hyperparameter values, along with code and data changes, helps you build a better model by letting you reproduce exactly the conditions that improved it.
In this workshop, you will learn how to use the open-source tool DVC to increase reproducibility for two methods of tuning hyperparameters: grid search and random search. We'll go through a live demo of setting up and running grid search and random search experiments. By the end of the workshop, you'll know how to add reproducibility to your existing projects.
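To illustrate the difference between the two tuning methods covered in the workshop, here is a minimal sketch in plain Python (not DVC-specific; the search space and parameter names are hypothetical). Grid search enumerates every combination of values, while random search samples a fixed number of combinations from the same space:

```python
import itertools
import random

# Hypothetical hyperparameter space, for illustration only
space = {
    "learning_rate": [0.001, 0.01, 0.1],
    "n_estimators": [50, 100, 200],
}

# Grid search: exhaustively enumerate every combination (3 x 3 = 9 trials)
grid = [dict(zip(space, values)) for values in itertools.product(*space.values())]

# Random search: sample a fixed budget of random combinations
random.seed(0)  # fixed seed so the sampled trials are reproducible
samples = [{k: random.choice(v) for k, v in space.items()} for _ in range(4)]
```

The practical trade-off: grid search cost grows multiplicatively with each added hyperparameter, while random search lets you cap the number of trials up front. Either way, recording the seed and the sampled values is what makes a run reproducible.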
Prerequisites: Knowledge of machine learning projects