Description
It's 2024: are you still computing gradients by hand? Shame on you.
Thanks to Automatic Differentiation (AD), you can build awesome optimization or machine learning algorithms and let the computer worry about derivatives on its own. Well, mostly.
In Python, life is simple: choose an AD framework first (PyTorch or JAX), then write some code in that framework, and it should work out okay.
In Julia, life is fun: write the code first, then pick an AD package to differentiate it (ForwardDiff, Zygote, Enzyme, etc.). This paradigm is both more flexible and more dangerous, because not all code can be differentiated by every tool.
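As a minimal sketch of that paradigm (assuming the ForwardDiff and Zygote packages are installed), the same plain Julia function can be handed to different AD packages after the fact:

```julia
# The function below is ordinary Julia code, written with no AD framework in mind.
using ForwardDiff, Zygote

f(x) = sum(abs2, x) / 2   # a plain Julia function

x = rand(3)
ForwardDiff.gradient(f, x)   # forward-mode AD
Zygote.gradient(f, x)[1]     # reverse-mode AD; both return the gradient, here equal to x
```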
Julia's diverse AD ecosystem is often daunting for newcomers. In this talk, I will try to bring some clarity by explaining
- How AD works under the hood
- Which tools to use in your daily work
- What to do if things go sideways