SamuelHorvath/Variance_Reduced_Optimizers_Pytorch
Variance Reduced Optimizers in PyTorch

This repo contains PyTorch implementations of the SVRG, SARAH (SpiderBoost), SCSG, and Geom-SARAH algorithms. It was used to produce the experiments for the paper Adaptivity of Stochastic Gradient Methods for Nonconvex Optimization (Horváth et al., 2020).
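All of these methods build on the same idea: correct a stochastic gradient with a control variate anchored at a periodically refreshed snapshot. As a minimal illustration of that estimator (a pure-Python toy on a quadratic objective, not this repo's API; the function name `svrg` and all parameters are ours), an SVRG loop looks like:

```python
import random

def svrg(a, lr=0.1, outer=20, inner=50, seed=0):
    """Toy SVRG: minimize mean_i 0.5 * (w - a[i])**2, whose minimizer is mean(a)."""
    rng = random.Random(seed)
    n = len(a)
    w = 0.0
    for _ in range(outer):
        w_snap = w
        # Full gradient at the snapshot point (computed once per outer epoch).
        mu = sum(w_snap - ai for ai in a) / n
        for _ in range(inner):
            i = rng.randrange(n)
            # Variance-reduced estimator: grad_i(w) - grad_i(w_snap) + mu.
            g = (w - a[i]) - (w_snap - a[i]) + mu
            w -= lr * g
    return w

w_star = svrg([1.0, 2.0, 3.0, 4.0])
print(round(w_star, 3))  # → 2.5, the mean of the data
```

SARAH/SpiderBoost replaces the fixed snapshot with a recursive anchor (the previous iterate), and SCSG subsamples the full-gradient step; the estimator structure above is otherwise the common core.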

To replicate our experiments, first recreate the conda environment from environment.yml. Run scripts are available in the runs/ directory.
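Assuming a standard conda workflow, the setup amounts to (the environment name and script name below are placeholders; use the `name:` field from environment.yml and a script from runs/):

```shell
# Recreate the environment from the repo's spec file
conda env create -f environment.yml

# Activate it (replace vr_optimizers with the name listed in environment.yml)
conda activate vr_optimizers

# Launch one of the provided run scripts (placeholder name)
bash runs/<script_name>.sh
```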

If you find this useful, please consider citing:

@article{horvath2020adaptivity,
  title={Adaptivity of stochastic gradient methods for nonconvex optimization},
  author={Horv{\'a}th, Samuel and Lei, Lihua and Richt{\'a}rik, Peter and Jordan, Michael I},
  journal={arXiv preprint arXiv:2002.05359},
  year={2020}
}
