S2CFormer Reimplementation

This is an UNOFFICIAL re-implementation of S2CFormer in PyTorch. The original paper is:

Y. Chen, Q. Li, B. He, D. Feng, R. Wu, Q. Wang, L. Song, G. Lu, and W. Zhang, "S2CFormer: Revisiting the RD-Latency Trade-off in Transformer-based Learned Image Compression," 2025, arXiv:2502.00700.

Installation

git clone https://github.com/tokkiwa/S2CFormer.git --recursive
cd S2CFormer
pip install compressai timm

Note that this code depends on the ELIC repository as a submodule. The submodule is based on JiangWeiBeta's ELIC implementation, with a slight modification for version compatibility.

If you did not clone with --recursive, please run:

git submodule update --init --recursive

Trained models

I trained S2CFormer-Large at four different bitrates on the MLIC-Train dataset for ~2M iterations, following the standard training procedure described in the paper. The learning rate is set to 1e-4, then decayed to 1e-5 at 1.73M iterations and to 1e-6 at 1.86M iterations.
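The piecewise-constant learning-rate schedule above can be sketched as a plain function (a minimal illustration of the decay points quoted from this README; in an actual PyTorch training loop the same schedule could be expressed with `torch.optim.lr_scheduler.MultiStepLR` and `gamma=0.1`):

```python
def learning_rate(iteration: int) -> float:
    """LR schedule used for the released checkpoints:
    1e-4 until 1.73M iterations, 1e-5 until 1.86M, then 1e-6."""
    if iteration < 1_730_000:
        return 1e-4
    if iteration < 1_860_000:
        return 1e-5
    return 1e-6

print(learning_rate(0))          # 0.0001
print(learning_rate(1_800_000))  # 1e-05
print(learning_rate(1_900_000))  # 1e-06
```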

Checkpoints are available here.

Trainable parameters

Model               Params (This Code)   Official Paper
S2CFormerIdentity   61,525,548           64.63M
S2CFormerConv       64,659,500           66.60M
S2CFormerAttention  67,492,652           68.42M
S2CFormerHybridS    64,638,508           68.00M
S2CFormerHybridM    69,383,212           72.73M
S2CFormerHybridL    76,500,268           79.83M

Quick eval (Kodak)

python3 eval.py \
  --cuda \
  --checkpoint /path/to/hybrid_l_0.013S2CFormerHybridL_checkpoint_best.noctx0.pth.tar \
  --data /path/to/kodak \
  --model_type hybridl \
  --real

Citation

@article{arxiv:2502.00700,
  author  = {Yunuo Chen and Qian Li and Bing He and Donghui Feng and Ronghua Wu and Qi Wang and Li Song and Guo Lu and Wenjun Zhang},
  title   = {S2CFormer: Revisiting the RD-Latency Trade-off in Transformer-based Learned Image Compression},
  journal = {arXiv preprint arXiv:2502.00700},
  year    = {2025},
  url     = {https://arxiv.org/pdf/2502.00700v3}
}
