Abstract: Cloud computing has successfully accommodated the three "V"s of Big Data into data science, but collecting everything into the cloud is becoming increasingly infeasible. Today, we face a new set of challenges. A growing awareness of privacy among individual users and governing bodies is forcing platform providers to restrict the variety of data we can collect. Often, we cannot transfer data to the cloud at the velocity of its generation. Many cloud users suffer from sticker shock, buyer's remorse, or both as they try to keep up with the volume of data they must process. Making sense of data closer to its home is more appealing than ever.
Federated learning is a growing field that attempts to address these challenges by distributing learning and analytics tasks to end-user devices. Although theoretical federated learning research is growing rapidly, we are far from putting those theories into practice. In this talk, I will introduce FedScale, a scalable and extensible open-source federated learning and analytics platform. It provides high-level APIs for implementing algorithms, a modular design for customizing implementations across diverse hardware and software backends, and the ability to deploy the same code at many scales. FedScale also includes a comprehensive benchmark that allows data scientists to evaluate their ideas in realistic, large-scale settings. I will highlight a select few systems successfully built using FedScale and share insights from benchmarking recent algorithms with it.
Bio: Mosharaf Chowdhury is a Morris Wellman associate professor of CSE at the University of Michigan, Ann Arbor, where he leads the SymbioticLab. His work improves the application performance and system efficiency of machine learning and big data workloads. He is also building software solutions to monitor and optimize the impact of machine learning systems on energy consumption and data privacy. His group developed Infiniswap, the first scalable software solution for memory disaggregation; Salus, the first software-only GPU sharing system for deep learning; FedScale, the largest federated learning benchmark and a scalable and extensible federated learning engine; and Zeus, the first optimizer of the GPU energy-vs-training performance tradeoff for DNN training. In the past, Mosharaf did seminal work on coflows and virtual network embedding, and he was a co-creator of Apache Spark. He has received many individual awards and fellowships, thanks to his stellar students and collaborators. His work has received seven paper awards from top venues, including NSDI, OSDI, and ATC, and over 22,000 citations. Mosharaf received his Ph.D. from UC Berkeley in 2015.