Anand K Subramanian



Posts about #deep-learning

29 Jun 2022

The Back-Gradient Trick

Stochastic Gradient Descent can be (kind of) reversed to compute gradients with respect to its hyperparameters.

#math #ml #gradient #graph #code #jax #deep-learning

18 Jun 2020

A Cleverer Trick on top of the Reparametrization Trick

Implicit differentiation enables efficient computation of the gradient of reparametrized samples.

#math #ml #gradient #deep-learning

08 Jun 2020

A Trick for Computing the Gradient of an Expectation

A trick for interchanging the gradient and the expectation of a function under the Gaussian distribution.

#math #ml #gradient #deep-learning

18 Aug 2019

Segue from Euclidean Gradient Descent to Natural Gradient Descent

A slight change in the SGD formulation, recast as maximizing a local approximation, leads to an interesting general connection to NGD via mirror descent.

#math #ml #gradient #natural-gradient #deep-learning

08 Aug 2019

A Fascinating Connection between Natural Gradients and the Exponential Family

The exponential family provides an elegant and simple way to compute natural gradients, which makes it well suited to variational inference.

#math #ml #gradient #natural-gradient #deep-learning

© 2024 Anand K Subramanian · License · Design · Built with Kutti