
Abstract: Legitimate privacy concerns often prevent us from making use of distributed data stored in protected silos. How can we enable practitioners to perform advanced analytics on this sensitive, siloed data without compromising the original data points? This tech talk explores one potential solution: split learning, a new method for training a modular deep neural network in which each module lives in a data silo, while upholding quantifiable standards for privacy and security. We will also dive into the privacy implications of training and releasing the model, including common privacy attacks and general use cases for federated analytics.
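To give a flavor of the setup the talk describes, here is a minimal, illustrative sketch of a single split-learning step in PyTorch. It is not from the talk itself: the layer sizes, optimizers, and synthetic batch are assumptions chosen only to show how the client-side module keeps raw data in the silo and exchanges only cut-layer activations and gradients with the server-side module.

```python
# Illustrative sketch only: one split-learning training step with made-up
# layer sizes and a synthetic batch standing in for data held in a silo.
import torch
import torch.nn as nn

# The "client" module stays inside the data silo; only its cut-layer
# activations ("smashed data") are sent out.
client_model = nn.Sequential(nn.Linear(20, 64), nn.ReLU())
# The "server" module finishes the forward pass and computes the loss.
server_model = nn.Sequential(nn.Linear(64, 32), nn.ReLU(), nn.Linear(32, 2))

client_opt = torch.optim.SGD(client_model.parameters(), lr=0.1)
server_opt = torch.optim.SGD(server_model.parameters(), lr=0.1)
loss_fn = nn.CrossEntropyLoss()

# Synthetic batch: sensitive records that never leave the silo.
x = torch.randn(16, 20)
y = torch.randint(0, 2, (16,))

# --- client side: forward pass up to the cut layer ---
smashed = client_model(x)
# What the server receives: activations detached from the client graph.
smashed_remote = smashed.detach().requires_grad_()

# --- server side: finish the forward pass, backprop to the cut layer ---
server_opt.zero_grad()
loss = loss_fn(server_model(smashed_remote), y)
loss.backward()
server_opt.step()

# --- client side: receive the cut-layer gradient, finish backprop locally ---
client_opt.zero_grad()
smashed.backward(smashed_remote.grad)
client_opt.step()

print(f"loss after one split-learning step: {loss.item():.4f}")
```

In this sketch the raw batch `x` never crosses the silo boundary; only the cut-layer activations and their gradients do, which is exactly the surface whose privacy implications the talk examines.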
Bio: Grzegorz (Greg) Gawron, MScEng, is a Staff Software Engineer at LiveRamp. He focuses on practical applications of differential privacy and other privacy-enhancing technologies, continuing his work from DataFleets, which was acquired by LiveRamp in February 2021. During his recent engagement in the IQSS/Harvard OpenDP Fellows program, he explored the privacy implications of split learning. He holds a Master's degree in computer science and a Master's degree in economics from Warsaw University in Poland, and is currently working toward a PhD in computer science at AGH University of Science and Technology in Krakow.

Grzegorz Gawron
Senior Staff Software Engineer | LiveRamp
