Abstract: The workhorse of many machine learning and deep learning applications is gradient descent. There are many flavors and implementations of gradient descent algorithms, but they all serve the same goal: optimizing a model's output with respect to an objective function. Implementations exist in popular libraries that data scientists and researchers use daily, such as PyTorch, scikit-learn, and TensorFlow.
Because so many Python implementations already exist, we often take these algorithms for granted. In this session, we'll walk through the fundamentals of gradient descent algorithms and supervised machine learning, and work through the basics of building objects in Python. Even if you never implement your own gradient descent algorithm, knowing the foundational tools and techniques to create one is critical to being a successful data scientist.
By the end of this session, you’ll be able to construct classes in Python that function as basic machine learning models using various gradient descent algorithms.
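To make the goal concrete, here is a minimal sketch of the kind of class the session describes: a linear regression model trained with batch gradient descent. This is not the speaker's code, and the class and parameter names (`LinearRegressionGD`, `lr`, `n_iters`) are illustrative assumptions, not material from the session.

```python
import numpy as np

class LinearRegressionGD:
    """Illustrative linear regression fit by batch gradient descent."""

    def __init__(self, lr=0.05, n_iters=1000):
        self.lr = lr            # learning rate (step size)
        self.n_iters = n_iters  # number of gradient steps
        self.w = None           # weight vector, set in fit()
        self.b = 0.0            # bias term

    def fit(self, X, y):
        n_samples, n_features = X.shape
        self.w = np.zeros(n_features)
        for _ in range(self.n_iters):
            y_pred = X @ self.w + self.b
            error = y_pred - y
            # Gradients of the mean squared error with respect to w and b
            grad_w = (2 / n_samples) * (X.T @ error)
            grad_b = (2 / n_samples) * error.sum()
            # Step against the gradient to reduce the loss
            self.w -= self.lr * grad_w
            self.b -= self.lr * grad_b
        return self

    def predict(self, X):
        return X @ self.w + self.b
```

A usage example: fitting the class on points drawn from y = 3x + 1 recovers a slope near 3 and an intercept near 1, since the mean squared error is minimized exactly at those values.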
Bio: Nico Van de Bovenkamp is a Senior Data Scientist at Nielsen, a leading market research company that measures media consumption. He works primarily on developing and deploying machine learning and deep learning systems for Advanced TV and Connected TV measurement. Nico is also a lead data science instructor at General Assembly, a global tech education company. In addition to teaching, Nico serves on General Assembly’s Data Science Product Advisory Board and is responsible for curating and managing their global data science curriculum.
Nico Van de Bovenkamp
ETP Data Scientist, Instructor | Nielsen, General Assembly
beginner-w19 | kickstarter-w19 | trainings-w19