Abstract: Much real-world data, such as medical records, customer interactions, or financial transactions, is recorded at irregular time intervals. However, most deep learning time series models, such as recurrent neural networks, require data to be recorded at regular intervals, such as hourly or daily. I'll explain some recent advances in building deep stochastic differential equation models that specify continuous-time dynamics. This allows us to fit a new family of richly parameterized distributions over time series. I'll discuss their strengths and limitations, and demonstrate these models on medical records and motion capture data. All the tools discussed are open-source.
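To make the idea of continuous-time dynamics concrete, here is a minimal sketch of simulating a stochastic differential equation with the Euler–Maruyama scheme on an irregular time grid. This is an illustrative toy, not code from the talk: the "neural" drift is a fixed random two-layer network standing in for learned dynamics, and all names and weights are hypothetical.

```python
import numpy as np

def euler_maruyama(drift, diffusion, z0, ts, rng):
    """Simulate dZ = drift(z, t) dt + diffusion(z, t) dW on a
    (possibly irregular) time grid ts, starting from z0."""
    z = np.asarray(z0, dtype=float)
    path = [z]
    for t0, t1 in zip(ts[:-1], ts[1:]):
        dt = t1 - t0
        # Brownian increment over a step of length dt
        dw = rng.normal(0.0, np.sqrt(dt), size=z.shape)
        z = z + drift(z, t0) * dt + diffusion(z, t0) * dw
        path.append(z)
    return np.stack(path)

rng = np.random.default_rng(0)
# Hypothetical "neural" drift: a fixed random two-layer network.
W1, W2 = rng.normal(size=(4, 2)), rng.normal(size=(2, 4))

def drift(z, t):
    return W2 @ np.tanh(W1 @ z)

def diffusion(z, t):
    return 0.1 * np.ones_like(z)

# Irregular observation times: step sizes adapt to the data, which is
# exactly what fixed-interval RNNs cannot do.
ts = np.array([0.0, 0.3, 0.35, 1.0, 1.8])
path = euler_maruyama(drift, diffusion, np.zeros(2), ts, rng)
print(path.shape)  # (5, 2): one state per observation time
```

Because the dynamics are defined in continuous time, the same model evaluates at any set of time points, so irregularly sampled records need no imputation onto a regular grid.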
Bio: David Duvenaud is an assistant professor in computer science and statistics at the University of Toronto, where he holds a Canada Research Chair in generative models. His postdoctoral research was done at Harvard University, where he worked on hyperparameter optimization, variational inference, and automatic chemical design. He did his Ph.D. at the University of Cambridge, studying Bayesian nonparametrics with Zoubin Ghahramani and Carl Rasmussen. David spent two summers in the machine vision team at Google Research, and also co-founded Invenia, an energy forecasting and trading company. David is a founding member of the Vector Institute for Artificial Intelligence.