Anand K Subramanian



Posts about #jax

29 Aug 2022

Improving the RANSAC Algorithm

A discussion of the MAGSAC algorithm, which addresses a crucial hyperparameter-selection issue in RANSAC: the choice of the inlier threshold (a toy sketch follows below).

#math #ml #code #jax
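
As a rough illustration of the threshold-marginalization idea (a minimal sketch only, not MAGSAC's actual sigma-consensus scoring; `inlier_count` and `marginalized_score` are hypothetical names):

```python
import jax
import jax.numpy as jnp

def inlier_count(residuals, threshold):
    # Vanilla RANSAC model score: the number of inliers under a
    # single, hand-picked threshold.
    return jnp.sum(residuals < threshold)

def marginalized_score(residuals, max_threshold, n_steps=32):
    # MAGSAC-style idea, heavily simplified: rather than committing
    # to one threshold, average the score over a range of thresholds.
    thresholds = jnp.linspace(1e-3, max_threshold, n_steps)
    scores = jax.vmap(lambda t: inlier_count(residuals, t))(thresholds)
    return jnp.mean(scores)

key = jax.random.PRNGKey(0)
residuals = jnp.abs(jax.random.normal(key, (100,)))  # toy residuals
print(marginalized_score(residuals, max_threshold=2.0))
```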

29 Jun 2022

The Back-Gradient Trick

Stochastic Gradient Descent can be (kind of) reversed, which makes it possible to compute gradients with respect to its own hyperparameters (see the sketch below).

#math #ml #gradient #graph #code #jax #deep-learning
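
A minimal sketch of such a hypergradient in JAX. The back-gradient trick reverses the SGD trajectory to avoid storing it; as an illustration under that assumption, the same quantity can be computed by naively differentiating through a short unrolled run (all names here are illustrative):

```python
import jax
import jax.numpy as jnp

def loss(w):
    # Toy objective with minimum at w = 3.
    return jnp.sum((w - 3.0) ** 2)

def final_loss(lr, w0, n_steps=50):
    # Plain SGD, unrolled with lax.scan so JAX can differentiate
    # through the whole trajectory.
    def step(w, _):
        return w - lr * jax.grad(loss)(w), None
    w_final, _ = jax.lax.scan(step, w0, None, length=n_steps)
    return loss(w_final)

# Hypergradient: d(final loss) / d(learning rate).
print(jax.grad(final_loss)(0.05, jnp.zeros(3)))
```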

31 May 2022

Parallelizing Kalman Filters

The associative property of Kalman (Bayesian) filters can be exploited to yield a parallel algorithm that runs in O(log N) time (illustrated below).

#math #ml #parallel #code #jax
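
The underlying principle can be shown with `jax.lax.associative_scan`: composing affine maps x -> A x + b is associative, so all prefix states of a linear recurrence come out in O(log N) parallel depth. This toy sketch omits the covariance bookkeeping of the actual parallel Kalman filter:

```python
import jax
import jax.numpy as jnp

def combine(e1, e2):
    # Compose affine maps: applying (A1, b1) then (A2, b2) gives
    # x -> A2 @ (A1 @ x + b1) + b2. This composition is associative.
    A1, b1 = e1
    A2, b2 = e2
    return A2 @ A1, jnp.einsum('...ij,...j->...i', A2, b1) + b2

N, d = 8, 2
As = 0.9 * jnp.broadcast_to(jnp.eye(d), (N, d, d))     # transition matrices
bs = jax.random.normal(jax.random.PRNGKey(0), (N, d))  # per-step offsets

# All N prefix compositions, with O(log N) parallel depth.
_, xs = jax.lax.associative_scan(combine, (As, bs))

# Sequential reference: x_t = A_t @ x_{t-1} + b_t from x_0 = 0 matches
# the offset component of the prefix compositions.
x = jnp.zeros(d)
for A, b in zip(As, bs):
    x = A @ x + b
print(jnp.allclose(xs[-1], x))  # True
```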

© 2024 Anand K Subramanian