SToG (Stochastic Gates) is a PyTorch-based library designed for efficient and differentiable feature selection in neural networks. It implements various stochastic gating mechanisms that allow models to learn sparse feature representations end-to-end.
The library includes implementations of:
- STG (Stochastic Gates): Gaussian-based relaxation of Bernoulli gates (see the sketch after this list).
- STE (Straight-Through Estimator): Hard thresholding with gradient approximation.
- Gumbel-Softmax: Categorical reparameterization for feature selection.
- Correlated STG: Handles multi-collinearity among features.
- L1 Regularization: Classic Lasso-style selection layer.
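As a rough illustration of the STG item above, here is a minimal PyTorch sketch of a Gaussian-relaxed stochastic gate: each feature is scaled by a gate `z_d = clamp(mu_d + eps_d, 0, 1)` with Gaussian noise `eps_d`, and sparsity is encouraged by penalizing the probability that each gate is open. The class name, defaults, and method names below are illustrative assumptions, not the SToG API.

```python
import torch
import torch.nn as nn


class STGGate(nn.Module):
    """Illustrative Gaussian-relaxed stochastic gate (STG-style); not the SToG API."""

    def __init__(self, num_features: int, sigma: float = 0.5):
        super().__init__()
        self.mu = nn.Parameter(0.5 * torch.ones(num_features))  # gate means, one per feature
        self.sigma = sigma  # noise scale of the Gaussian relaxation

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Inject Gaussian noise during training; use the deterministic mean at eval time.
        eps = self.sigma * torch.randn_like(self.mu) if self.training else torch.zeros_like(self.mu)
        z = torch.clamp(self.mu + eps, 0.0, 1.0)  # relaxed Bernoulli gate in [0, 1]
        return x * z

    def open_gate_penalty(self) -> torch.Tensor:
        # Sum over features of P(gate_d > 0) under the Gaussian relaxation;
        # adding lambda * penalty to the loss pushes gates toward zero (sparsity).
        return torch.distributions.Normal(0.0, 1.0).cdf(self.mu / self.sigma).sum()
```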
You can install the package directly from source:
```bash
pip install SToG
```

| Project Title: | Stochastic Gating for Robust Feature Selection |
|---|---|
| Project Type: | Research Project |
| Authors: | Eynullayev Altay, Firsov Sergey, Rubtsov Denis, Karpeev Gleb |
Feature selection is a crucial step in building interpretable and efficient machine learning models, especially in high-dimensional settings. This project investigates and implements stochastic gating mechanisms—a class of differentiable relaxation methods that enable gradient-based feature selection.
We provide a comprehensive library, SToG, which allows researchers and practitioners to plug feature selection layers into existing PyTorch architectures. The library supports various regularization techniques, handles correlated features, and provides a unified interface for benchmarking different selection strategies against standard baselines.
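To make the plug-in workflow concrete, here is a hypothetical end-to-end training loop that inserts the illustrative `STGGate` sketch from above in front of a small regression network and adds its sparsity penalty to the loss. The layer name, the penalty weight `lam`, and the toy data are assumptions for illustration rather than SToG's actual interface.

```python
import torch
import torch.nn as nn

# Assumes the illustrative STGGate sketch defined earlier is in scope.
num_features, lam = 20, 0.1
gate = STGGate(num_features)
model = nn.Sequential(gate, nn.Linear(num_features, 16), nn.ReLU(), nn.Linear(16, 1))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

x, y = torch.randn(128, num_features), torch.randn(128, 1)  # toy data
model.train()
for _ in range(200):
    optimizer.zero_grad()
    loss = nn.functional.mse_loss(model(x), y) + lam * gate.open_gate_penalty()
    loss.backward()
    optimizer.step()

# At evaluation the gates are deterministic; features with mu <= 0 are dropped.
model.eval()
selected = (gate.mu > 0).nonzero(as_tuple=True)[0]
print("selected feature indices:", selected.tolist())
```

The penalty weight trades off sparsity against accuracy: larger values close more gates, smaller values keep more features active.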
