Abstract: Large-scale labeled training datasets have enabled deep neural networks to excel across a wide range of benchmark machine learning tasks. In many problems, however, obtaining large quantities of labeled data is prohibitively expensive and time-consuming. This talk introduces my recent research on learning with fewer labels. I develop domain adaptation, low-shot learning, and self-supervised learning algorithms to transfer information across multiple domains and recognize novel categories from only a few samples. My research enables learning systems to automatically adapt to real-world variations and new environmental conditions. Specifically, I will talk about adversarial multiple-source domain adaptation, multi-source distilling domain adaptation, learning invariant risks and representations for domain transfer, compositional few-shot recognition with primitive discovery and enhancing, distant-domain few-shot recognition with mid-level patterns, and generalized zero-shot learning with dual adversarial networks.
Bio: Dr. Shanghang Zhang is a postdoctoral research fellow at the Berkeley AI Research (BAIR) Lab in the Department of Electrical Engineering and Computer Sciences, UC Berkeley, USA. She received her Ph.D. from Carnegie Mellon University in 2018. Her research interests cover deep learning, computer vision, and reinforcement learning, as reflected in her numerous publications in top-tier journals and conference proceedings, including NeurIPS, CVPR, ICCV, and AAAI. Her research focuses mainly on machine learning with limited training data, including low-shot learning, domain adaptation, and meta-learning, enabling learning systems to automatically adapt to real-world variations and new environments. She was one of the “2018 Rising Stars in EECS” (a highly selective program launched at MIT in 2012, since hosted annually at UC Berkeley, Carnegie Mellon, and Stanford). She has also received the Qualcomm Innovation Fellowship (QInF) Finalist Award and the Chiang Chen Overseas Graduate Fellowship.