Stochastic Second-Order Methods in JAX
Updated Sep 23, 2024 - Python
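As a rough illustration of what a stochastic second-order method looks like in JAX (a minimal sketch, not the listed repository's code), the snippet below takes a damped Newton-CG step on a minibatch using Hessian-vector products; `loss_fn`, `params`, and `batch` are hypothetical placeholders.

```python
import jax
import jax.numpy as jnp
from jax.scipy.sparse.linalg import cg

def hvp(loss_fn, params, batch, v):
    # Hessian-vector product of the minibatch loss at `params` with vector `v`,
    # via forward-over-reverse differentiation (no explicit Hessian is built).
    grad_fn = lambda p: jax.grad(loss_fn)(p, batch)
    return jax.jvp(grad_fn, (params,), (v,))[1]

def newton_cg_step(loss_fn, params, batch, lr=1.0, cg_iters=10, damping=1e-3):
    # Gradient of the minibatch loss.
    g = jax.grad(loss_fn)(params, batch)
    # Approximately solve (H + damping * I) d = g with conjugate gradient,
    # using only Hessian-vector products.
    matvec = lambda v: hvp(loss_fn, params, batch, v) + damping * v
    d, _ = cg(matvec, g, maxiter=cg_iters)
    return params - lr * d

# Toy usage: a quadratic least-squares loss on a tiny "minibatch".
loss_fn = lambda p, batch: jnp.mean((batch @ p) ** 2)
params = jnp.ones(3)
batch = jnp.array([[1.0, 2.0, 0.5], [0.3, -1.0, 2.0]])
params = newton_cg_step(loss_fn, params, batch)
```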
Implementation of the paper "A Self-attention Ansatz for Ab-initio Quantum Chemistry" in PyTorch.
Fisher Flow: a unified information-geometric framework for sequential inference, revealing how modern optimizers (Adam, natural gradient, K-FAC, EWC) emerge as special cases of Fisher information propagation.
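For context on the natural-gradient connection mentioned above (a minimal sketch under simplifying assumptions, not Fisher Flow's code), the snippet below preconditions a minibatch gradient with a damped empirical Fisher matrix built from per-example gradients; `neg_log_lik` and the flat parameter vector are assumptions chosen for illustration.

```python
import jax
import jax.numpy as jnp

def natural_gradient_step(neg_log_lik, params, batch, lr=0.1, damping=1e-3):
    # Mean gradient of the per-example negative log-likelihood over the minibatch.
    g = jax.grad(lambda p: jnp.mean(jax.vmap(neg_log_lik, (None, 0))(p, batch)))(params)
    # Per-example gradients, used to form the empirical Fisher F ~ E[grad grad^T].
    per_example_grads = jax.vmap(jax.grad(neg_log_lik), (None, 0))(params, batch)
    fisher = per_example_grads.T @ per_example_grads / batch.shape[0]
    # Natural-gradient direction: solve (F + damping * I) d = g
    # instead of explicitly inverting the Fisher matrix.
    d = jnp.linalg.solve(fisher + damping * jnp.eye(g.shape[0]), g)
    return params - lr * d

# Toy usage: Gaussian negative log-likelihood with unit variance and mean = params
# (assumes a flat parameter vector, which keeps the Fisher a small dense matrix).
neg_log_lik = lambda p, x: 0.5 * jnp.sum((x - p) ** 2)
params = jnp.zeros(2)
batch = jnp.array([[1.0, -0.5], [0.2, 0.8], [1.5, 0.0]])
params = natural_gradient_step(neg_log_lik, params, batch)
```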