Merged
3 changes: 0 additions & 3 deletions README.md
@@ -6,9 +6,6 @@
 
 # XLB: A Differentiable Massively Parallel Lattice Boltzmann Library in Python for Physics-Based Machine Learning
 
-🎉 **Exciting News!** 🎉 XLB version 0.2.0 has been released, featuring a complete rewrite of the library and introducing support for the NVIDIA Warp backend!
-XLB can now be installed via pip: `pip install xlb`.
-
 XLB is a fully differentiable 2D/3D Lattice Boltzmann Method (LBM) library that leverages hardware acceleration. It supports [JAX](https://github.com/google/jax) and [NVIDIA Warp](https://github.com/NVIDIA/warp) backends, and is specifically designed to solve fluid dynamics problems in a computationally efficient and differentiable manner. Its unique combination of features positions it as an exceptionally suitable tool for applications in physics-based machine learning. With the new Warp backend, XLB now offers state-of-the-art performance for even faster simulations.
 
 ## Getting Started
14 changes: 7 additions & 7 deletions setup.py
@@ -2,7 +2,7 @@
 
 setup(
     name="xlb",
-    version="0.2.1",
+    version="0.3.0",
     description="XLB: Accelerated Lattice Boltzmann (XLB) for Physics-based ML",
     long_description=open("README.md").read(),
     long_description_content_type="text/markdown",
@@ -15,16 +15,16 @@
         "numpy>=2.1.2",
         "pyvista>=0.44.1",
         "trimesh>=4.4.9",
-        "warp-lang>=1.4.0",
+        "warp-lang>=1.10.0",
         "numpy-stl>=3.1.2",
         "pydantic>=2.9.1",
-        "ruff>=0.6.5",
-        "jax>=0.4.34",  # Base JAX CPU-only requirement
+        "ruff>=0.14.1",
+        "jax>=0.8.0",  # Base JAX CPU-only requirement
     ],
     extras_require={
-        "cuda": ["jax[cuda12]>=0.4.34"],  # For CUDA installations
-        "tpu": ["jax[tpu]>=0.4.34"],  # For TPU installations
+        "cuda": ["jax[cuda13]>=0.8.0"],  # For CUDA installations
+        "tpu": ["jax[tpu]>=0.8.0"],  # For TPU installations
     },
-    python_requires=">=3.10",
+    python_requires=">=3.11",
     dependency_links=["https://storage.googleapis.com/jax-releases/libtpu_releases.html"],
 )
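The `extras_require` table above maps pip "extras" onto the two accelerated JAX variants. A sketch of how users would select a backend at install time, assuming this release is published to PyPI under these constraints (the exact version pins are from the diff; availability on PyPI is an assumption):

```shell
# CPU-only install; pulls the base jax>=0.8.0 requirement
pip install "xlb>=0.3.0"

# CUDA install; the "cuda" extra resolves to jax[cuda13]>=0.8.0
pip install "xlb[cuda]>=0.3.0"

# TPU install; the "tpu" extra resolves to jax[tpu]>=0.8.0
pip install "xlb[tpu]>=0.3.0"
```

Note that `dependency_links` is deprecated in modern pip; TPU users may instead need to pass the libtpu index explicitly via `-f https://storage.googleapis.com/jax-releases/libtpu_releases.html`.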
8 changes: 7 additions & 1 deletion xlb/__init__.py
@@ -1,3 +1,10 @@
+from importlib.metadata import PackageNotFoundError, version
+
+try:
+    __version__ = version("xlb")
+except PackageNotFoundError:
+    __version__ = "0.0.0"
+
 # Enum classes
 from xlb.compute_backend import ComputeBackend as ComputeBackend
 from xlb.precision_policy import PrecisionPolicy as PrecisionPolicy, Precision as Precision
@@ -15,7 +22,6 @@
 import xlb.operator.stream
 import xlb.operator.boundary_condition
 import xlb.operator.macroscopic
-import xlb.operator.immersed_boundary
 import xlb.operator.postprocess
 
 # Grids
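The new `__version__` block in `xlb/__init__.py` is the standard `importlib.metadata` pattern: read the version of the installed distribution, and fall back to a sentinel when the package is imported from a source checkout that was never installed. A minimal standalone sketch of the same logic (the helper name `detect_version` is illustrative, not part of XLB):

```python
from importlib.metadata import PackageNotFoundError, version


def detect_version(dist_name: str) -> str:
    """Return the installed distribution's version string.

    Falls back to the sentinel "0.0.0" when no distribution with
    that name is installed (e.g. running from an uninstalled tree).
    """
    try:
        return version(dist_name)
    except PackageNotFoundError:
        return "0.0.0"


# A distribution name that is not installed triggers the fallback:
print(detect_version("definitely-not-a-real-distribution"))  # → 0.0.0
```

This keeps the version in exactly one place (the package metadata written by `setup.py`), so bumping `version="0.3.0"` in `setup.py` automatically updates `xlb.__version__` on the next install.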