ResNet

Modular ResNet implementation in TensorFlow. The pre-activation approach is used for all residual and bottleneck blocks.
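
The pre-activation design (He et al., "Identity Mappings in Deep Residual Networks") places batch normalization and ReLU before each convolution and adds the shortcut without a subsequent activation. Below is a minimal tf.keras sketch of such a block; the actual code in this repository may be structured differently (e.g. with lower-level TensorFlow ops instead of Keras layers).

```python
import tensorflow as tf
from tensorflow.keras import layers


def preact_residual_block(x, filters, stride=1):
    """Pre-activation residual block: (BN -> ReLU -> Conv) x 2 plus a shortcut."""
    # Shared pre-activation of the block input
    y = layers.BatchNormalization()(x)
    y = layers.Activation("relu")(y)

    # Identity shortcut, or a 1x1 projection when the stride or channel count changes
    if stride != 1 or x.shape[-1] != filters:
        shortcut = layers.Conv2D(filters, 1, strides=stride, padding="same")(y)
    else:
        shortcut = x

    # First convolution (handles downsampling via the stride)
    y = layers.Conv2D(filters, 3, strides=stride, padding="same")(y)

    # Second pre-activation + convolution
    y = layers.BatchNormalization()(y)
    y = layers.Activation("relu")(y)
    y = layers.Conv2D(filters, 3, strides=1, padding="same")(y)

    # Residual addition; no activation afterwards in the pre-activation design
    return layers.Add()([y, shortcut])


# Usage sketch: ResNet-110 for CIFAR-10 stacks 3 x 18 such blocks after a small stem
inputs = tf.keras.Input(shape=(32, 32, 3))
x = layers.Conv2D(16, 3, padding="same")(inputs)
x = preact_residual_block(x, 16)
```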

Currently, ResNet-110 for CIFAR-10 is implemented. The first goal is to match the results reported in the paper for CIFAR-10 (see the table below).

Results

The training set is split into 45k training images and 5k validation images, which are used to tune the training parameters and the global_step for early stopping. Afterwards, the network is retrained on the full training set. Results are reported on the test set.
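
As a rough sketch of that split using the Keras CIFAR-10 loader (the repository's own input pipeline may differ):

```python
import tensorflow as tf

# Standard CIFAR-10 split: 50k training images, 10k test images
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.cifar10.load_data()

# Hold out the last 5k training images as a validation set, used to tune
# hyperparameters and pick the early-stopping step (global_step)
x_tr, y_tr = x_train[:45000], y_train[:45000]
x_val, y_val = x_train[45000:], y_train[45000:]

# Once the stopping step is fixed, retrain on all 50k images (x_train, y_train)
# and evaluate a single time on (x_test, y_test).
```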

Dataset    Layers   Test Accuracy   Optimizer      Reported Accuracy (paper)
CIFAR-10   110      83.28 %         Adam           93.67 %
CIFAR-10   110      91.58 %         Momentum-SGD   93.67 %

Differences from the paper

  • Batch size 32 due to GPU memory limitations
  • Learning rate schedule adapted to the smaller batch size (see the sketch after this list)
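
One way to adapt the schedule is to keep the paper's piecewise-constant decay but scale its step boundaries by the batch-size ratio (128 / 32 = 4). The boundaries and values below are illustrative assumptions, not necessarily the ones used in this repository:

```python
import tensorflow as tf

# Paper schedule (batch 128): lr 0.1, divided by 10 at 32k and 48k iterations.
# With batch 32 each epoch takes 4x as many steps, so the boundaries are
# scaled by 4x here (assumed values, for illustration only).
schedule = tf.keras.optimizers.schedules.PiecewiseConstantDecay(
    boundaries=[128_000, 192_000],
    values=[0.1, 0.01, 0.001],
)
optimizer = tf.keras.optimizers.SGD(learning_rate=schedule, momentum=0.9)
```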
