Bart Kuipers, Freek Byrman, Daniel Uyterlinde, and Alejandro García-Castellanos
University of Amsterdam
This repository builds upon the codebase from the paper: Scale Equivariant Graph Metanetworks
To extract the CNN dataset, move to the data/ directory and execute:
tar -xvf cifar10.tar.xz
# download cifar10 splits
wget https://github.com/AllanYangZhou/nfn/raw/refs/heads/main/experiments/predict_gen_data_splits/cifar10_split.csv -O data/cifar10/cifar10_split.csv
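
As an optional sanity check, the downloaded split file can be inspected with a few lines of Python. This is only a sketch: it assumes the file landed at data/cifar10/cifar10_split.csv as in the wget command above and makes no assumption about its column names.

# Print the header and the first few rows of the downloaded split file.
import csv
from itertools import islice
from pathlib import Path

split_path = Path("data/cifar10/cifar10_split.csv")  # path used by the wget command above
with split_path.open(newline="") as f:
    reader = csv.reader(f)
    for row in islice(reader, 4):  # header plus the first three rows
        print(row)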
The directory configs/cifar10 contains six YAML configuration files. Each corresponds to an optimal hyperparameter setting found via grid search.
Example:
configs/
└── cifar10/
├── scalegmn_optimizer_cnn_tanh.yml
├── scalegmn_optimizer_mlp_tanh.yml
└── ...
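
A rough sketch for inspecting one of these configs programmatically is shown below. It assumes PyYAML is installed; the nested scalegmn_args/cnn_args layout is inferred from the CLI override used in the symmetry-breaking commands further down and may differ from the actual files.

# Illustrative only: load one of the provided configs and look at its structure.
import yaml  # requires PyYAML

with open("configs/cifar10/scalegmn_optimizer_cnn_tanh.yml") as f:
    cfg = yaml.safe_load(f)

print("top-level keys:", list(cfg.keys()))
# Assumption based on the override flag used later in this README:
# cfg.get("scalegmn_args", {}).get("cnn_args", {}) would hold model settings
# such as break_symmetry.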
To start training, run the following commands:
python -u scalegmn_optimizer_cnn_train.py --conf configs/cifar10/scalegmn_optimizer_cnn_tanh.yml
python -u scalegmn_optimizer_mlp_train.py --conf configs/cifar10/scalegmn_optimizer_mlp_tanh.yml

Once the model is trained, you can evaluate it on the test set using:

python -u scalegmn_optimizer_cnn_test.py --conf configs/cifar10/scalegmn_optimizer_cnn_tanh.yml
python -u scalegmn_optimizer_mlp_test.py --conf configs/cifar10/scalegmn_optimizer_mlp_tanh.yml

To break the scale equivariance by omitting the canonicalization step, simply run the same commands with the --scalegmn_args.<cnn/mlp>_args.break_symmetry flag:

python -u scalegmn_optimizer_cnn_train.py --conf configs/cifar10/scalegmn_optimizer_cnn_tanh.yml --scalegmn_args.cnn_args.break_symmetry
python -u scalegmn_optimizer_mlp_train.py --conf configs/cifar10/scalegmn_optimizer_mlp_tanh.yml --scalegmn_args.mlp_args.break_symmetry
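
For intuition only, the snippet below illustrates the kind of sign symmetry that canonicalization removes for tanh networks: because tanh is odd, flipping the sign of a hidden unit's incoming weights and bias together with its outgoing weights leaves the network function unchanged, so a canonical representative can be chosen per unit. This is a minimal sketch with a hypothetical NumPy MLP, not the procedure implemented in this repository.

# Sign symmetry of a tanh MLP and a simple canonicalization (illustrative only).
import numpy as np

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(8, 4)), rng.normal(size=8)  # hypothetical 4-8-2 tanh MLP
W2, b2 = rng.normal(size=(2, 8)), rng.normal(size=2)

def forward(x, W1, b1, W2, b2):
    return W2 @ np.tanh(W1 @ x + b1) + b2

def canonicalize(W1, b1, W2):
    # Pick each hidden unit's sign so that its bias is non-negative.
    s = np.where(b1 >= 0, 1.0, -1.0)
    return W1 * s[:, None], b1 * s, W2 * s[None, :]

x = rng.normal(size=4)
W1c, b1c, W2c = canonicalize(W1, b1, W2)
# Same function, canonical representative of the sign-symmetry orbit:
assert np.allclose(forward(x, W1, b1, W2, b2), forward(x, W1c, b1c, W2c, b2))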
If you find this work helpful, please cite our paper:

@misc{kuipers2025symmetryawarefullyamortizedoptimizationscale,
title={Symmetry-Aware Fully-Amortized Optimization with Scale Equivariant Graph Metanetworks},
author={Bart Kuipers and Freek Byrman and Daniel Uyterlinde and Alejandro García-Castellanos},
year={2025},
eprint={2510.08300},
archivePrefix={arXiv},
primaryClass={cs.AI},
url={https://arxiv.org/abs/2510.08300},
}