
Abstract: Flux (http://fluxml.ai) is a new machine learning library that’s easy and intuitive to use, yet scales to handle the most difficult research challenges. As machine learning models grow increasingly complex, we suggest that neural networks are best viewed as an emerging, differentiable programming paradigm, and ask what decades of research into programming languages and compilers have to offer the machine learning world.
Flux is written entirely in Julia, an easy yet high-performance programming language with syntax similar to Python’s. You can train models using high-level, Keras-like interfaces, or drop down to the mathematics, allowing complete customisation all the way down to the CUDA kernels. Meanwhile, Julia’s advanced compiler technology allows us to provide cutting-edge performance.
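As a flavour of the high-level interface mentioned above, a minimal sketch of defining and running a small network in Flux might look like the following (layer sizes and input are illustrative, not from the workshop material):

```julia
using Flux

# A small multilayer perceptron: 4 inputs -> 2 class probabilities.
model = Chain(
    Dense(4, 8, relu),  # hidden layer with ReLU activation
    Dense(8, 2),        # output layer
    softmax,            # normalise outputs into probabilities
)

x = rand(Float32, 4)  # a dummy input vector
y = model(x)          # forward pass; entries of y sum to 1
```

Because layers are ordinary Julia functions, the same model can be inspected, differentiated, or rewritten down to the kernel level when the high-level interface is not enough.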
This workshop will introduce Flux and its approach to building differentiable, trainable algorithms, and will show simple but practical examples in image recognition, reinforcement learning and natural language processing. We’ll also cover Flux’s ecosystem of ready-made models, and how these can be used to get a head start on real-world problems.
Bio: Avik has spent many years helping investment banks leverage technology in risk and capital markets. He’s worked on bringing AI-powered solutions to investment research, and is currently the VP of Engineering at Julia Computing.

Speaker: Avik Sengupta
Title: VP of Engineering at Julia Computing
Category: europe-2018-workshops
