SToG: Stochastic Gates for Feature Selection


SToG (Stochastic Gates) is a PyTorch-based library designed for efficient and differentiable feature selection in neural networks. It implements various stochastic gating mechanisms that allow models to learn sparse feature representations end-to-end.

The library includes implementations of:

  • STG (Stochastic Gates): Gaussian-based relaxation of Bernoulli gates.
  • STE (Straight-Through Estimator): Hard thresholding with gradient approximation.
  • Gumbel-Softmax: Categorical reparameterization for feature selection.
  • Correlated STG: Handles multi-collinearity among features.
  • L1 Regularization: Classic Lasso-style selection layer.
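To make the first item concrete, here is a minimal sketch of a Gaussian-based stochastic gate in PyTorch. The class and method names are illustrative, not the actual SToG API: each feature gets a learnable mean `mu`, Gaussian noise is injected during training, and a hard-sigmoid clip keeps the gate in `[0, 1]` while staying differentiable almost everywhere.

```python
import torch
import torch.nn as nn

class StochasticGateSketch(nn.Module):
    """Illustrative STG-style gate (not the actual SToG API)."""

    def __init__(self, in_features: int, sigma: float = 0.5):
        super().__init__()
        self.mu = nn.Parameter(torch.zeros(in_features))  # gate means
        self.sigma = sigma  # fixed noise std used during training

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        if self.training:
            eps = self.sigma * torch.randn_like(self.mu)
        else:
            eps = torch.zeros_like(self.mu)
        # Hard-sigmoid clipping keeps each gate in [0, 1] but remains
        # differentiable w.r.t. mu almost everywhere.
        z = torch.clamp(self.mu + eps + 0.5, 0.0, 1.0)
        return x * z

    def open_gate_prob(self) -> torch.Tensor:
        # P(gate > 0) under the Gaussian relaxation; summing these
        # probabilities gives the expected number of active features,
        # which is the usual sparsity regularizer for this relaxation.
        normal = torch.distributions.Normal(0.0, 1.0)
        return normal.cdf((self.mu + 0.5) / self.sigma)
```

At initialization every gate is half-open (`mu = 0` gives `z = 0.5` in eval mode), and training pushes each `mu` toward fully open or fully closed under the sparsity penalty.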

Installation

You can install the package with pip:

pip install SToG

Project Information

Project Title: Stochastic Gating for Robust Feature Selection
Project Type: Research Project
Authors: Eynullayev Altay, Firsov Sergey, Rubtsov Denis, Karpeev Gleb

Abstract

Feature selection is a crucial step in building interpretable and efficient machine learning models, especially in high-dimensional settings. This project investigates and implements stochastic gating mechanisms—a class of differentiable relaxation methods that enable gradient-based feature selection.

We provide SToG, a comprehensive library that allows researchers and practitioners to easily plug feature selection layers into existing PyTorch architectures. The library supports various regularization techniques, handles correlated features, and provides a unified interface for benchmarking different selection strategies against standard baselines.
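The "plug in" pattern described above can be sketched as follows. The `FeatureGate` layer here is a hypothetical stand-in (a plain learnable elementwise mask); the real SToG layer names and signatures may differ:

```python
import torch
import torch.nn as nn

class FeatureGate(nn.Module):
    """Learnable per-feature multiplicative gate (illustrative only)."""

    def __init__(self, in_features: int):
        super().__init__()
        self.logits = nn.Parameter(torch.zeros(in_features))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Multiply each input feature by a gate in (0, 1).
        return x * torch.sigmoid(self.logits)

# Drop the selection layer in front of an ordinary model.
model = nn.Sequential(
    FeatureGate(20),
    nn.Linear(20, 64),
    nn.ReLU(),
    nn.Linear(64, 1),
)
out = model(torch.randn(8, 20))
```

Because the gate is just another `nn.Module`, it trains jointly with the downstream network, and a sparsity penalty on the gate parameters can be added to the loss to drive feature selection.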

About

SToG is an educational project for the course Bayesian Multimodeling by Eynullayev Altay, Rubtsov Denis, Firsov Sergey and Karpeev Gleb. SToG is a library for feature selection using different stochastic gating approaches.
