Bayesian inference in JAX
Author: Marie-Hélène Burle
Coming up on February 25.