Helical builds the Virtual AI Lab for Biological Discovery. This open framework provides access to state-of-the-art Bio Foundation Models across genomics, transcriptomics, and single-cell data modalities.
Helical simplifies the entire lifecycle of applying Bio Foundation Models, from model access to fine-tuning and in-silico experimentation. With Helical's open-source framework, you can:
- Leverage the latest Bio Foundation Models through a simple Python interface
- Run example notebooks for key downstream tasks
- Customize models and workflows for your own datasets and experiments
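As an illustration, here is a minimal sketch of that unified Python interface, following the pattern shown in the Helical docs for the Geneformer wrapper; the exact class names may vary between models and releases, and the dataset path `your_data.h5ad` is a placeholder:

```python
# Hedged sketch of the Helical model interface; class names follow the
# Geneformer wrapper pattern and the dataset path is illustrative only.
try:
    import anndata as ad
    from helical.models.geneformer import Geneformer, GeneformerConfig

    model = Geneformer(configurer=GeneformerConfig(batch_size=10))
    adata = ad.read_h5ad("your_data.h5ad")      # raw single-cell counts
    dataset = model.process_data(adata)         # tokenize counts for the model
    embeddings = model.get_embeddings(dataset)  # one embedding vector per cell
    ran = True
except (ImportError, FileNotFoundError):
    ran = False  # helical not installed, or the placeholder file is missing
```

Other integrated models follow the same configure / process / embed shape, which is what makes swapping models in a workflow straightforward.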
This repository is continuously updated with new models, benchmarks, and utilities. Join us in shaping the next generation of AI-powered biology.
Let’s build the most exciting AI-for-Bio community together!
We have integrated the Tahoe-x1 foundation model for single-cell RNA-seq data. This transformer-based model can extract both cell and gene embeddings from raw count data and supports attention weight extraction for interpretability. Try it out with our comprehensive tutorial notebook!
We have integrated the new Cell2Sentence-Scale models which use cell sentences as input and are based on the Gemma language model architecture (2B and 27B models available in quantised versions too). You can use this model for embeddings and perturbation prediction. Follow our notebook tutorial here.
We have integrated the new Geneformer models, which are larger and have been trained on more data. Find out which models have been integrated into the Geneformer suite in the model card, and check out our notebook on drug perturbation prediction using the different Geneformer scales here.
We have integrated TranscriptFormer into our helical package and have made a model card for it in our TranscriptFormer model folder. If you would like to test the model, take a look at our example notebook!
We’re thrilled to announce the release of our first-ever mRNA Bio Foundation Model, designed to:
- Be Efficient, handling long sequence lengths effortlessly
- Balance Diversity & Specificity, leveraging a 2-step pre-training approach
- Deliver High Resolution, operating at single-nucleotide resolution
Check out our blog post to learn more about our approach and read the model card to get started.
We recommend installing Helical within a conda environment (this step is optional). Run the commands below in your terminal:
conda create --name helical-package python=3.11.13
conda activate helical-package
To install the latest pip release of our Helical package, you can run the command below:
pip install helical
Note: PyTorch is sometimes not installed as the CUDA-compiled build (e.g. on certain architectures). In that case, install Helical with GPU support manually using the command below (or install PyTorch with CUDA first and then install Helical):
pip install helical --extra-index-url https://download.pytorch.org/whl/cuXXX (replace XXX with your CUDA version, e.g. 128 for CUDA 12.8)
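Deriving the cuXXX suffix from a CUDA version is just a matter of dropping the dot from the major.minor version. A tiny hypothetical helper (not part of Helical) makes the mapping explicit:

```python
def cuda_wheel_suffix(cuda_version: str) -> str:
    """Map a CUDA version like '12.8' to the pip extra-index suffix 'cu128'.

    Hypothetical helper for illustration; only the major.minor parts are used.
    """
    major, minor = cuda_version.split(".")[:2]
    return f"cu{major}{minor}"

print(cuda_wheel_suffix("12.8"))  # → cu128
print(cuda_wheel_suffix("11.8"))  # → cu118
```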
To install the latest version of the Helical package directly from GitHub, you can run the command below:
pip install --upgrade git+https://github.com/helicalAI/helical.git
Alternatively, clone the repo and install it:
git clone https://github.com/helicalAI/helical.git
pip install .
### Flash Attention Support
To enable Flash Attention (required by some models), run the command below:
pip install flash-attn --no-build-isolation
Important: Make sure that your PyTorch CUDA version matches your system CUDA version, especially when using flash-attn.
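As a quick sanity check before installing flash-attn, you can print the CUDA version your PyTorch build was compiled against and compare it with `nvcc --version` on your system. A hedged snippet (it degrades gracefully if torch is missing):

```python
# Print the CUDA version the installed PyTorch build was compiled against.
# torch.version.cuda is a string like "12.8", or None for CPU-only builds.
try:
    import torch
    compiled_cuda = torch.version.cuda
except ImportError:
    compiled_cuda = None

if compiled_cuda is None:
    print("No CUDA-enabled PyTorch found; flash-attn needs a CUDA build.")
else:
    print(f"PyTorch was compiled against CUDA {compiled_cuda}")
```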
### Mamba-SSM Model Installation [Optional]
To install mamba-ssm and causal-conv1d, use the command below:
pip install helical[mamba-ssm]
or in case you're installing from the Helical repo cloned locally:
pip install .[mamba-ssm]
### Evo2 Model Installation
To install Evo2 specifically, follow the instructions in the Evo2 model card.
To install Tahoe-X1 do the following after installing helical:
pip install helical[tahoe]
- Make sure your machine has GPU(s) and CUDA installed. Currently this is a requirement for the packages mamba-ssm and causal-conv1d.
- The package `causal-conv1d` requires `torch` to be installed already. Installing `helical` separately first (without `[mamba-ssm]`) installs `torch` for you; a second installation (with `[mamba-ssm]`) then installs the packages correctly.
- If you have problems installing `mamba-ssm`, you can install the package via the `.whl` files provided on their release page here. Choose the package according to your CUDA, torch and Python versions:
pip install https://github.com/state-spaces/mamba/releases/download/v2.2.4/mamba_ssm-2.2.4+cu12torch2.3cxx11abiFALSE-cp311-cp311-linux_x86_64.whl
- Now continue with `pip install .[mamba-ssm]` to also install the remaining `causal-conv1d`.
If you want to run your code in a Singularity container, you can use the singularity.def file and build an Apptainer image with it:
apptainer build --sandbox singularity/helical singularity.def
and then shell into the sandbox container (use the --nv flag if you have a GPU available):
apptainer shell --nv --fakeroot singularity/helical/
To run examples, be sure to have installed the Helical package (see Installation) and that it is up-to-date.
You can look directly into the examples folder above and download the script of your choice, look into our documentation for step-by-step guides, or clone the repository directly using:
git clone https://github.com/helicalAI/helical.git
Within the examples/notebooks folder, open the notebook of your choice. We recommend starting with Quick-Start-Tutorial.ipynb.
| Example | Description | Colab |
|---|---|---|
| Quick-Start-Tutorial.ipynb | A tutorial to quickly get used to the helical package and environment. | |
| Helix-mRNA.ipynb | An example of how to use the Helix-mRNA model. | |
| Geneformer-vs-TranscriptFormer.ipynb | Zero-shot reference mapping with Geneformer & TranscriptFormer, comparing the outcomes. | |
| Hyena-DNA-Inference.ipynb | An example of how to do probing with HyenaDNA by training a neural network on 18 downstream classification tasks. | |
| Cell-Type-Annotation.ipynb | An example of how to do probing with scGPT by training a neural network to predict cell type annotations. | |
| Cell-Type-Classification-Fine-Tuning.ipynb | An example of how to fine-tune different models on classification tasks. | |
| HyenaDNA-Fine-Tuning.ipynb | An example of how to fine-tune the HyenaDNA model on downstream benchmarks. | |
| Cell-Gene-Cls-embedding-generation.ipynb | A notebook explaining the different embedding modes of single cell RNA models. | |
| Geneformer-Series-Comparison.ipynb | A zero-shot comparison of Geneformer model scales on drug perturbation prediction. | |
| Cell2Sen-Tutorial.ipynb | An example tutorial of how to use cell2sen models for embeddings and perturbation predictions. | |
| Tahoe-x1-Tutorial.ipynb | A comprehensive tutorial on using the Tahoe-x1 model for extracting cell and gene embeddings, with attention visualization. |
We are eager to help you and interact with you:
- Join our Slack channel where you can discuss applications of bio foundation models.
- You can also open Github issues here.
If you are working (or plan to work) with bio foundation models such as Geneformer or UCE on RNA and DNA data, Helical will be your best buddy! We provide and improve on:
- Up-to-date model library
- A unified API for all models
- User-facing abstractions tailored to computational biologists, researchers & AI developers
- Innovative use case and application examples and ideas
- Efficient data processing & code-base
We will continuously upload the latest models, publish benchmarks, and make our code base more efficient.
We welcome all kinds of contributions, including code, documentation, bug reports, and feature suggestions. Please read our Contributing Guidelines to help us keep the project organized and collaborative.
A lot of our models have been published by talented authors developing these exciting technologies. We sincerely thank the authors of the following open-source projects:
- scGPT
- Geneformer
- UCE
- TranscriptFormer
- HyenaDNA
- Cell2Sen
- Tahoe-X1
- llm-foundry
- composer
- anndata
- scanpy
- transformers
- scikit-learn
- GenePT
- Caduceus
- Evo2
- torch
- torchvision
You can find the licenses for each model implementation in the respective model repositories.
Please use this BibTeX to cite this repository in your publications:
@software{allard_2024_13135902,
author = {Helical Team},
title = {helicalAI/helical: v1.1.0},
month = nov,
year = 2024,
publisher = {Zenodo},
version = {1.1.0},
doi = {10.5281/zenodo.13135902},
url = {https://doi.org/10.5281/zenodo.13135902}
}