Abstract: The Julia language is known for combining performance, ease of use, and flexibility. Over the past year this foundation has continued to expand, and Julia users have built on it to push the envelope in several areas of data science and scientific computing. This talk will explain some of these developments and how to take advantage of them. First, we introduced a new composable multi-threading system that makes it easy to exploit multiple processor cores in all Julia code and libraries. This has been used, for example, to multi-thread CSV parsing in CSV.jl (written in 100% Julia), bringing its performance on par with that of the best CSV readers. Second, Julia is at the forefront of the emerging field of scientific machine learning, which combines data-driven learning approaches with traditional modeling techniques such as differential equations. Julia packages now exist for applying scientific ML to a range of real-world problems.
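As a flavor of what "composable" means here, the following minimal sketch (assuming Julia 1.3 or later, started with multiple threads, e.g. `julia -t 4`) uses `Threads.@spawn` to parallelize a recursive sum; the function name `psum` and the cutoff are illustrative choices, not part of any library API. Nested spawns like this compose because the scheduler maps tasks onto the available threads rather than creating a thread per task:

```julia
using Base.Threads

# Recursive parallel sum: each @spawn creates a lightweight task that the
# Julia scheduler runs on one of the available threads. Because tasks (not
# raw threads) are the unit of parallelism, nested or library-internal
# spawns compose without oversubscribing the machine.
function psum(xs)
    length(xs) <= 1000 && return sum(xs)   # small inputs: sum serially
    mid = length(xs) ÷ 2
    left = Threads.@spawn psum(@view xs[1:mid])   # run first half as a task
    right = psum(@view xs[mid+1:end])             # second half on this task
    return fetch(left) + right                    # wait for the spawned half
end

psum(collect(1:10_000))  # == 50_005_000
```

The same task-based model is what allows a library such as CSV.jl to spawn its own parallel work internally while still cooperating with threading in the caller's code.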
Bio: Jeff is one of the creators of Julia, having co-founded the project at MIT in 2009 and received a Ph.D. related to the language in 2015. He continues to work on the compiler and system internals, while also expanding Julia's commercial reach as a co-founder of Julia Computing, Inc.