Code that reproduces the experiments in the technical report:
G. Papamakarios, Comparison of Modern Stochastic Optimization Algorithms, Technical report, University of Edinburgh, 2014. [pdf] [bibtex]
The experiments benchmark four optimization algorithms on two convex problems. The algorithms are:
- Batch gradient descent
- Stochastic gradient descent
- Semi-stochastic gradient descent
- Stochastic average gradient
And the tasks are:
- Logistic regression on synthetic data
- Softmax regression on the MNIST dataset of handwritten digits
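
For concreteness, here is a generic sketch of the stochastic gradient descent update for logistic regression. It is illustrative only, not the implementation in the `opt` folder, and all names and constants in it are assumed.

```matlab
% Generic SGD for logistic regression; an illustrative sketch, not the
% code in opt. Dataset size, step size and iteration count are assumed.
N = 1000; D = 10;                        % small dataset for illustration
x = randn(N, D);                         % Gaussian inputs
w_true = randn(D, 1);                    % ground-truth weights
y = double(x * w_true > 0);              % 0/1 labels
w = zeros(D, 1);                         % initial weights
step = 0.1;                              % constant step size
for iter = 1:10000
    i = randi(N);                        % pick one datapoint uniformly
    p = 1 / (1 + exp(-x(i, :) * w));     % sigmoid prediction for point i
    g = x(i, :)' * (p - y(i));           % stochastic gradient of the log loss
    w = w - step * g;                    % SGD update
end
```

Batch gradient descent would average this gradient over all `N` datapoints at every step; S2GD and SAG sit in between, using occasional full gradients or a running average of stored per-datapoint gradients to cut the variance of the update.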
First, run `install.m` to add all necessary paths to MATLAB's path. After that, all scripts and functions in this folder can be run.
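
For context, a plausible sketch of what such an install script does, assuming it lives in the project root; the actual `install.m` may differ:

```matlab
% Assumed sketch of an install script: put the project root and all of
% its subfolders on MATLAB's search path.
root = fileparts(mfilename('fullpath'));  % folder containing this script
addpath(genpath(root));                   % add root and every subfolder
```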
To run the experiment on synthetic data:

- Run `gen_synth_data.m` to generate a synthetic dataset for logistic regression. Modify parameters `N` and `D` to change the number of datapoints and dimensions respectively (see the sketch after this list).
- Run `benchmark_logistic_synth.m` to benchmark all algorithms on the synthetic dataset. Results are written in the `results` folder.
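
As referenced in the list above, a hypothetical sketch of generating such a dataset; the actual `gen_synth_data.m` may differ in its noise model, defaults for `N` and `D`, and output file:

```matlab
% Hypothetical synthetic dataset for logistic regression. The values of
% N and D and the output path are assumptions for illustration.
N = 10000;                               % number of datapoints
D = 100;                                 % number of dimensions
x = randn(N, D);                         % Gaussian inputs
w = randn(D, 1);                         % random ground-truth weights
p = 1 ./ (1 + exp(-x * w));              % class-1 probabilities
y = double(rand(N, 1) < p);              % noisy Bernoulli labels
save(fullfile('data', 'logistic_synth.mat'), 'x', 'y');
```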
To run the experiment on MNIST:

- Download the following files from the MNIST website:
  - train-images-idx3-ubyte.gz
  - train-labels-idx1-ubyte.gz
  - t10k-images-idx3-ubyte.gz
  - t10k-labels-idx1-ubyte.gz
- Unzip them and place them in the folder `data/mnist`, then run `prepare_mnist_data.m` (a minimal IDX reader sketch follows this list).
- Run `benchmark_softmax_mnist.m` to benchmark all algorithms on MNIST. Results are written in the `results` folder.
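
The unzipped MNIST files are in the binary IDX format: big-endian 32-bit integers giving a magic number and the dimension sizes, followed by the raw bytes. A minimal reader sketch for the image files, with assumed function and variable names and not necessarily how `prepare_mnist_data.m` handles them:

```matlab
% Read an unzipped IDX3 image file, e.g. data/mnist/train-images-idx3-ubyte.
% An assumed helper for illustration.
function images = read_idx_images(filename)
    fid = fopen(filename, 'rb');
    assert(fid ~= -1, 'could not open %s', filename);
    magic = fread(fid, 1, 'int32', 0, 'ieee-be');  % 2051 marks an image file
    assert(magic == 2051, 'not an IDX3 image file');
    num  = fread(fid, 1, 'int32', 0, 'ieee-be');   % number of images
    rows = fread(fid, 1, 'int32', 0, 'ieee-be');   % rows per image
    cols = fread(fid, 1, 'int32', 0, 'ieee-be');   % columns per image
    images = fread(fid, inf, 'uint8');             % raw pixels
    fclose(fid);
    images = reshape(images, rows * cols, num);    % one column per image
end
```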
The contents of the project are organized as follows:

- `install.m`: the script you need to run before you do anything else.
- `opt`: contains implementations of the four optimization algorithms: GD, SGD, S2GD and SAG.
- `data`: contains scripts for generating synthetic data and preparing the MNIST data. These datasets are needed for the benchmarks.
- `benchmarks`: contains scripts that run the experiments. Datasets must have been generated first. These scripts save and plot results.
- `util`: some utility functions used throughout the project.