Acknowledgements to Angel Aaron Flores Alberca.

For a mental model of how this repository was created, see PsiFormer.
- What this does
- Key ideas (short)
- Quick install (CPU-only)
- Usage
- Files and layout
- Example workflow
- Tips and caveats
- Reproducibility
- References and further reading
- License
## What this does

This repository demonstrates an experimental approach to approximating solutions to the many-electron Schrödinger equation using transformer architectures. It contains reference code and utilities to build and run small-scale experiments on CPU (no CUDA required).
- Uses transformer-style attention to model electronic wavefunction structure.
- Provides utilities and example scripts to run toy problems and evaluate model performance against simple quantum chemistry references.
This project is intended for research and learning. It is not production-ready for large-scale quantum chemistry simulations but can be used to prototype ideas and compare modeling choices.
## Key ideas (short)

- Represent electronic configurations or basis expansions as sequences and learn interactions using attention layers (see the sketch after this list).
- Use permutation-equivariant input encodings or ordered basis sequences to capture antisymmetry constraints through learned modules and loss terms.
- Train with physics-informed losses (energy expectation, cusp conditions, or density overlaps) and supervised or self-supervised pretraining.
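To make the first idea concrete, here is a minimal self-contained sketch: per-electron features become a token sequence, a standard transformer encoder mixes them with attention, and a pooled readout produces a log-amplitude. This is illustrative only; the class and parameter names are assumptions, not this repo's actual model.

```python
import torch
import torch.nn as nn

class WavefunctionTransformer(nn.Module):
    """Toy sketch: map per-electron feature tokens to a scalar log-amplitude."""

    def __init__(self, feat_dim=4, d_model=32, nhead=4, num_layers=2):
        super().__init__()
        self.embed = nn.Linear(feat_dim, d_model)  # one token per electron/orbital
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=nhead,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=num_layers)
        self.readout = nn.Linear(d_model, 1)

    def forward(self, x):
        # x: (batch, n_electrons, feat_dim). Without positional encodings,
        # self-attention is permutation-equivariant and mean pooling makes the
        # output permutation-invariant, so antisymmetry still has to come from
        # an extra module or a loss term (see the second key idea).
        h = self.encoder(self.embed(x))
        return self.readout(h.mean(dim=1)).squeeze(-1)  # (batch,) log|psi|

model = WavefunctionTransformer()
x = torch.randn(8, 6, 4)   # 8 sampled configurations, 6 electrons, 4 features
print(model(x).shape)      # torch.Size([8])
```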
## Quick install (CPU-only)

If you use uv (a fast installer), you can install a CPU-only PyTorch wheel and other dependencies like this:
```bash
# install uv if you don't have it
pip install -U uv

# install CPU-only PyTorch (from the official PyTorch CPU index)
uv pip install torch --index-url https://download.pytorch.org/whl/cpu

# install other lightweight deps (edit as needed)
uv pip install -r requirements.txt || pip install -r requirements.txt
```

If you prefer plain pip for CPU-only PyTorch:

```bash
pip install --index-url https://download.pytorch.org/whl/cpu torch
pip install -r requirements.txt
```

Note: the code in this repo sets PyTorch's default device to CPU; no CUDA setup is required.
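A quick way to check that the CPU-only wheel was installed correctly:

```python
import torch

print(torch.__version__)          # the installed CPU-only build
print(torch.cuda.is_available())  # should print False for the CPU wheel
```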
## Usage

- Prepare data / basis encodings for the target system.
- Edit `config.py` to set model size, number of heads, block (sequence) length, and training hyperparameters (an illustrative sketch follows below).
- Run the training/experiment script (example):

```bash
python src/main.py --config configs/toy_system.yaml
```

Replace the example command above with your own runner or flags as needed.
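For orientation, a config for a model like this typically reduces to a handful of fields. The names and defaults below are illustrative assumptions, not the actual contents of this repo's `config.py`:

```python
from dataclasses import dataclass

@dataclass
class Config:
    # model size (illustrative defaults)
    d_model: int = 64       # embedding / hidden width
    n_heads: int = 4        # attention heads; must divide d_model
    n_layers: int = 3       # transformer blocks
    block_size: int = 16    # max sequence length (basis / electron tokens)
    # training hyperparameters
    lr: float = 3e-4
    batch_size: int = 32
    n_steps: int = 10_000
    seed: int = 0
```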
## Files and layout

- `src/` — main source code
  - `main.py` — training / experiment entrypoint
  - `utils.py` — helper utilities, device selection, etc.
  - `config.py` — config and hyperparameters
- `pyproject.toml` — Python project metadata
- `template.tex` — report template (optional)
## Example workflow

- Choose a small system (2–10 electrons) and a compact basis set.
- Build sequence encodings for orbitals/electron coordinates.
- Train the transformer to predict minimal-energy coefficients or approximate the wavefunction amplitude for sampled configurations.
- Evaluate energy and compare against reference (Hartree–Fock, small CI); a minimal energy-loss sketch follows this list.
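For the coefficient-prediction variant of step 3, the training loss can be as simple as the Rayleigh quotient in the chosen basis. A minimal sketch, assuming the Hamiltonian matrix `H` for the toy system has been precomputed (here it is just a random symmetric stand-in):

```python
import torch

def rayleigh_quotient(c, H):
    """Energy expectation E = (c^T H c) / (c^T c) for real basis coefficients."""
    return (c @ H @ c) / (c @ c)

# toy stand-in for a precomputed Hamiltonian matrix: random but symmetric
A = torch.randn(4, 4)
H = 0.5 * (A + A.T)

c = torch.randn(4, requires_grad=True)   # e.g. coefficients predicted by a model
energy = rayleigh_quotient(c, H)
energy.backward()                        # gradients ready for an optimizer step

# the exact ground-state energy of H, for comparison with the converged loss
print(float(energy), float(torch.linalg.eigvalsh(H).min()))
```

Minimizing the quotient drives the coefficients toward the ground state of `H`; the converged value is what you would compare against a Hartree–Fock or small-CI reference energy.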
## Tips and caveats

- Enforcing antisymmetry exactly (Slater determinants) is nontrivial when using sequence models; consider hybrid approaches, e.g. learning corrections to a Slater determinant (sketched after this list), or include antisymmetry in the loss.
- Transformers scale quadratically with sequence length; start with small basis sizes and toy molecules.
- Use physics-informed losses wherever possible to improve sample efficiency.
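One concrete version of the hybrid approach from the first tip: wrap an explicitly antisymmetric Slater determinant (via `torch.linalg.slogdet`) with a permutation-invariant learned correction, in the spirit of PauliNet/FermiNet. Everything here (class name, orbital parameterization, sizes) is a toy assumption, not this repo's implementation:

```python
import torch
import torch.nn as nn

class SlaterWithCorrection(nn.Module):
    """Hybrid ansatz sketch: antisymmetric Slater determinant times a
    permutation-invariant learned factor, so the product stays antisymmetric."""

    def __init__(self, n_electrons=4, feat_dim=3, hidden=32):
        super().__init__()
        # n_electrons single-particle orbitals, each evaluated at every electron
        self.orbitals = nn.Linear(feat_dim, n_electrons)
        # per-electron MLP; mean pooling keeps the correction symmetric
        self.jastrow = nn.Sequential(nn.Linear(feat_dim, hidden), nn.Tanh(),
                                     nn.Linear(hidden, 1))

    def forward(self, x):
        # x: (batch, n_electrons, feat_dim)
        phi = self.orbitals(x)                              # (batch, n, n) matrix
        sign, logdet = torch.linalg.slogdet(phi)            # antisymmetric part
        log_corr = self.jastrow(x).mean(dim=1).squeeze(-1)  # symmetric part
        return sign, logdet + log_corr                      # sign and log|psi|

model = SlaterWithCorrection()
x = torch.randn(2, 4, 3)
sign, logpsi = model(x)
# swapping two electrons flips the sign but leaves log|psi| unchanged
sign_sw, logpsi_sw = model(x[:, [1, 0, 2, 3], :])
print(torch.equal(sign, -sign_sw), torch.allclose(logpsi, logpsi_sw))
```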
## Reproducibility

- Pin dependencies in `requirements.txt` if you need strict reproducibility.
- Seed RNGs at the start of experiments (`torch.manual_seed`, `random.seed`), as in the snippet below.
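A typical seeding preamble (NumPy seeding is included on the assumption that NumPy is used somewhere in the pipeline):

```python
import random

import numpy as np
import torch

SEED = 0
random.seed(SEED)        # Python's built-in RNG
np.random.seed(SEED)     # NumPy (if used anywhere in the pipeline)
torch.manual_seed(SEED)  # PyTorch CPU generator
```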
## References and further reading

- Vaswani et al., "Attention Is All You Need" (the original transformer paper).
- Recent literature on machine learning for quantum chemistry and wavefunction modeling (e.g., neural quantum states, FermiNet, PauliNet).
## License

This project is provided under the repository license.