Flux.jl is one of the leading Julia packages for deep learning, providing a wide array of established deep learning building blocks such as convolutional, attention, and recurrent layers, to name a few. This session will examine the foundational mechanisms of Flux.jl, emphasizing three key elements developed below: layers, datasets, and representative test cases.
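To give a flavour of these building blocks, here is a minimal sketch (not taken from the session material, and with illustrative layer sizes) composing some of Flux.jl's built-in layers with Chain:

```julia
using Flux

# Compose built-in layers with Chain; the sizes below are illustrative.
model = Chain(
    Conv((3, 3), 1 => 16, relu),    # convolutional layer
    MaxPool((2, 2)),
    Flux.flatten,
    Dense(16 * 13 * 13 => 10),      # fully connected layer
    softmax,
)

# Forward pass on a dummy 28x28 grayscale image (batch of one).
x = rand(Float32, 28, 28, 1, 1)
y = model(x)
```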
Throughout the discussion, we will examine how some of the built-in layers are structured and learn how to write custom ones. A significant part of the presentation will also be devoted to datasets, in particular how to import and use datasets from Python and R in a Flux.jl deep learning workflow.
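As a taste of the custom-layer part, here is a small sketch in the style documented by Flux.jl: a parameterised struct, a callable forward method, and Flux.@functor so the parameters can be collected for training. The Affine name and sizes are illustrative.

```julia
using Flux

# A custom layer: a struct holding parameters, made callable for the
# forward pass, and registered with @functor so Flux sees its parameters.
struct Affine{M, V}
    W::M
    b::V
end

Affine(in::Int, out::Int) = Affine(randn(Float32, out, in), zeros(Float32, out))

(a::Affine)(x) = a.W * x .+ a.b

Flux.@functor Affine

layer = Affine(4, 2)
layer(rand(Float32, 4))    # returns a length-2 output vector
```

For datasets coming from R, one possible route (an assumption on our part, not necessarily the one used in the session) is the RDatasets.jl package, which exposes many standard R datasets as DataFrames; datasets from Python can similarly be brought in through a Julia/Python bridge.

```julia
using RDatasets

iris = dataset("datasets", "iris")   # the classic iris data frame from R
```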
Finally, we will analyze several well-known test cases: an MLP for function approximation, a CNN for image classification, and Transformers for text comprehension and translation tasks.
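For instance, the MLP approximation case might look roughly like the following sketch, which fits sin(x) with a small multilayer perceptron; the architecture and hyperparameters are illustrative assumptions, not the session's actual example.

```julia
using Flux

# Training data: approximate y = sin(x) on [-π, π].
x = reshape(collect(Float32, range(-π, π; length = 256)), 1, :)
y = sin.(x)

# A small MLP and a mean-squared-error loss.
model = Chain(Dense(1 => 32, tanh), Dense(32 => 32, tanh), Dense(32 => 1))
loss(m, x, y) = Flux.Losses.mse(m(x), y)

# Explicit-style training loop with the Adam optimiser.
opt_state = Flux.setup(Adam(1e-2), model)
for epoch in 1:500
    grads = Flux.gradient(m -> loss(m, x, y), model)
    Flux.update!(opt_state, model, grads[1])
end
```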
This session is open to everyone and will take place on the Mathrice BBB platform.
https://greenlight.virtualdata.cloud.math.cnrs.fr/b/pie-ed2-tke
Please register to attend this session. The session will be recorded; registering implies accepting this.
Alessandra Iacobucci et Pierre Navaro