Oct 28 – 30, 2024
Toulouse
Europe/Paris timezone

Automatic Differentiation - Julia's most confusing superpower?

Oct 29, 2024, 2:00 PM
1h
Amphi A001 (Toulouse)

INP-ENSEEIHT, 2 Rue Charles Camichel 31071 Toulouse
Talk

Speaker

Guillaume Dalle (EPFL)

Description

It's 2024: are you still computing gradients by hand? Shame on you.
Thanks to Automatic Differentiation (AD), you can build awesome optimization or machine learning algorithms and let the computer worry about derivatives on its own. Well, mostly.
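
As a minimal illustration (a toy example of mine, not taken from the talk), here is what that looks like with ForwardDiff.jl, one of the packages mentioned below:

    using ForwardDiff  # assumes ForwardDiff.jl is installed

    # A scalar loss of a vector input; no derivative written by hand
    loss(x) = sum(abs2, x .- [1.0, 2.0, 3.0]) / 2

    g = ForwardDiff.gradient(loss, zeros(3))  # returns [-1.0, -2.0, -3.0]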

In Python, life is simple: choose an AD framework first (PyTorch or JAX), then write some code in that framework, and it should work out okay.
In Julia, life is fun: write the code first, then pick an AD package to differentiate it (ForwardDiff, Zygote, Enzyme, etc.). This paradigm is both more flexible and more dangerous, because not all code can be differentiated by every tool.
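
As a hedged sketch of that workflow (again my own toy example, not the speaker's code), the same plain function can be handed to two different backends after the fact:

    using ForwardDiff, Zygote  # assumes both packages are installed

    # Plain Julia code, written without any particular AD framework in mind
    f(x) = sum(sin, x) + 0.5 * sum(abs2, x)

    x = randn(4)

    g1 = ForwardDiff.gradient(f, x)  # forward mode
    g2 = Zygote.gradient(f, x)[1]    # reverse mode; Zygote returns one gradient per argument

    g1 ≈ g2  # true, up to floating-point error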

Julia's diverse AD ecosystem is often daunting for newcomers. In this talk, I will try to bring some clarity by explaining:

  • How AD works under the hood (see the dual-number sketch after this list)
  • Which tools to use in your daily work
  • What to do if things go sideways
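
For the first point, the core idea of forward-mode AD can be sketched in a handful of lines of Julia. The Dual type below is purely illustrative (real packages such as ForwardDiff.jl are far more general), but it shows how values and derivatives are propagated together through overloaded operations:

    # Minimal dual number: carry a value and its derivative side by side
    struct Dual
        val::Float64  # primal value
        der::Float64  # derivative with respect to the input
    end

    # Each overload is just the corresponding differentiation rule
    Base.:+(a::Dual, b::Dual) = Dual(a.val + b.val, a.der + b.der)
    Base.:*(a::Dual, b::Dual) = Dual(a.val * b.val, a.der * b.val + a.val * b.der)
    Base.sin(a::Dual) = Dual(sin(a.val), cos(a.val) * a.der)

    # Differentiate f(x) = sin(x) * x at x = 2 by seeding the derivative with 1
    f(x) = sin(x) * x
    d = f(Dual(2.0, 1.0))
    d.der  # ≈ 2cos(2) + sin(2), the exact derivative of x*sin(x)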
