
collab-splats

Extension tools for nerfstudio that enable depth/normal derivation and meshing (among other functions) for Gaussian splatting.

For more details, see the paper reference and figures.

Installation

Docker

We provide a Docker image for running nerfstudio with collab-splats (along with other tools!) at tommybotch/collab-splats:latest

Once the Docker image is running, install the package as follows:

pip install git+https://github.com/BasisResearch/collab-splats

Standalone setup w/o CUDA

git clone https://github.com/BasisResearch/collab-splats/
cd collab-splats
uv venv --python=3.10 && source .venv/bin/activate && uv pip install pip
bash setup_nocuda.sh

Building the Docker image

The Docker image includes an example video file (C0043.MP4) downloaded from Google Cloud Storage during the build process. Follow these steps to build the image:

Prerequisites:

  1. Obtain a Google Cloud Storage service account key with access to the collab-data bucket
  2. Save the key as a JSON file

Build Steps:

  1. Place the service account key file:

    # Copy your GCS service account key to the build directory
    cp /path/to/your/service-account-key.json ./api-key.json

    Important: The key file MUST be named exactly api-key.json and placed in the same directory as the Dockerfile.

  2. Build the Docker image:

    docker build --platform=linux/amd64 -t collab-splats:latest .
  3. Clean up the key file after build:

    rm ./api-key.json

What happens during build:

  • Installs rclone
  • Configures GCS access using your service account key
  • Downloads fieldwork_processed/2024_02_06-session_0001/SplatsSD/C0043.MP4 to /opt/data/ in the image
  • Removes all credentials from the final image, which contains rclone (without any stored credentials) and the example video file

Security Notes:

  • The service account key is only used during build time
  • No credentials are stored in the final Docker image
  • The key file and rclone configuration are completely removed after the download completes
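
The flow above might look roughly like the following multi-stage Dockerfile fragment. This is an illustrative sketch, not the actual Dockerfile: the base image, stage name, and rclone remote name (`gcs`) are assumptions.

```dockerfile
# Hypothetical sketch of the credential-scoped download step
FROM ubuntu:22.04 AS downloader
RUN apt-get update && apt-get install -y rclone
COPY api-key.json /tmp/api-key.json
# Configure a GCS remote from the key, fetch the example video, then
# delete both the key file and the rclone remote in the same layer
RUN rclone config create gcs "google cloud storage" service_account_file /tmp/api-key.json \
 && rclone copy "gcs:collab-data/fieldwork_processed/2024_02_06-session_0001/SplatsSD/C0043.MP4" /opt/data/ \
 && rm /tmp/api-key.json \
 && rclone config delete gcs

# The final stage copies only the downloaded data, so no credential
# layer ever reaches the published image
FROM ubuntu:22.04
RUN apt-get update && apt-get install -y rclone
COPY --from=downloader /opt/data/ /opt/data/
```

A multi-stage build like this is what makes the security guarantee hold: credentials only ever exist in layers of the throwaway `downloader` stage.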

Conda

Follow the Nerfstudio installation instructions to install a conda environment. For convenience, here are the commands I've used to successfully build a nerfstudio environment.

Note: This requires cuda developer tools -- specifically nvcc

Create an isolated conda environment (I've successfully built with python3.10)

# Set our system
export UBUNTU_VERSION=22.04
export NVIDIA_CUDA_VERSION=11.8.0

# You can remove some of these and fit them to your system needs 
export CUDA_ARCHITECTURES="90;89;86;80;75;70;61" 

conda create --name nerfstudio -y python=3.10
conda activate nerfstudio

Next, install torch and torchvision built for CUDA 11.8 -- this specifically has to be run via pip for tiny-cuda-nn to detect the packages.

# Install torch and torchvision (from specified URL)
pip install torch==2.1.2+cu118 torchvision==0.16.2+cu118 --extra-index-url https://download.pytorch.org/whl/cu118

# Install cuda developer tools 
conda install -c 'nvidia/label/cuda-11.8.0' cuda-toolkit -y

Install the hloc toolbox for SfM options.

# Install hloc
git clone --branch master --recursive https://github.com/cvg/Hierarchical-Localization.git /opt/hloc
cd /opt/hloc
git checkout v1.4
git submodule update --init --recursive
pip install -e . --no-cache-dir
cd ~

# Bump down for hloc interface
pip install --no-cache-dir pycolmap==0.4.0 

Downgrade setuptools to avoid a tiny-cuda-nn error -- you also need a numpy 1.x version

conda install -c conda-forge setuptools==69.5.1 'numpy<2.0.0'

Now is where the pain begins... tiny-cuda-nn is the big snag point of the installation -- it will also take the longest to build.

# Note which CUDA architectures to build for
export TCNN_CUDA_ARCHITECTURES=${CUDA_ARCHITECTURES}

pip install -v ninja git+https://github.com/NVlabs/tiny-cuda-nn/#subdirectory=bindings/torch

Install gsplat-rade and nerfstudio -- this gsplat version is required to run this code, as it contains the CUDA kernel for calculating depth and normal maps.

# Install specific gsplat version
pip install git+https://github.com/brian-xu/gsplat-rade.git

# Install nerfstudio from github (newer features available that are useful)
git clone https://github.com/nerfstudio-project/nerfstudio.git /opt/nerfstudio
cd /opt/nerfstudio
pip install . --no-cache-dir

# Bump the numpy version back down (nerfstudio upgrades for some reason)
conda install -c conda-forge 'numpy<2.0.0'
conda install -c conda-forge 'cmake>3.5' ninja gmp cgal ipykernel
pip install -r /tmp/requirements.txt

Lastly, install collab-splats -- currently via a direct clone and editable install because the repository is private. For full functionality, you can optionally install collab-data.

# If the repository were public: pip install git+https://github.com/BasisResearch/collab-splats
git clone https://github.com/BasisResearch/collab-splats/
cd collab-splats

# Runs pip install -e .
bash setup.sh

# Optional install of collab-data
pip install git+https://github.com/BasisResearch/collab-data.git

Usage

collab-splats is built to integrate different Gaussian splatting codebases that enable depth and normal map creation. Specifically, it implements the depth-normal consistency loss.

Two models are currently offered:

  • rade-gs: the baseline extension model that enables depth and normal map creation within the rasterization process. This is built on top of gsplat-rade and is heavily inspired by the scaffold-gs-nerfstudio implementation.
  • rade-features: extends rade-gs to enable splatting of ANN feature spaces. This draws inspiration from the original feature-splatting-ns implementation but contains additional functionality.
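
To illustrate the depth-normal consistency idea, here is a minimal numpy sketch -- not the actual gsplat-rade CUDA kernel, just the underlying computation: derive pseudo-normals from the rendered depth map by finite differences, then penalize disagreement with the directly rendered normals.

```python
import numpy as np

def depth_to_normals(depth, fx=1.0, fy=1.0):
    """Approximate surface normals from a depth map via finite differences."""
    dz_dx = np.gradient(depth, axis=1) * fx
    dz_dy = np.gradient(depth, axis=0) * fy
    # Normals point opposite the depth gradient, with a unit z component
    n = np.stack([-dz_dx, -dz_dy, np.ones_like(depth)], axis=-1)
    return n / np.linalg.norm(n, axis=-1, keepdims=True)

def depth_normal_consistency(depth, rendered_normals):
    """Mean (1 - cosine similarity) between depth-derived and rendered normals."""
    derived = depth_to_normals(depth)
    cos = np.sum(derived * rendered_normals, axis=-1)
    return float(np.mean(1.0 - cos))

# A flat, fronto-parallel depth plane agrees perfectly with +z normals
depth = np.full((4, 4), 2.0)
normals = np.zeros((4, 4, 3))
normals[..., 2] = 1.0
print(depth_normal_consistency(depth, normals))  # ~0.0
```

The loss is zero when the geometry implied by depth matches the rendered normals, and grows as the two diverge -- which is what pushes the splats toward consistent surfaces during training.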

Within the class Splatter we provide the ability to preprocess, train, and visualize splatting models within nerfstudio. We also enable meshing as a post-processing strategy for all splatting outputs.
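
As a hypothetical usage sketch (pseudocode -- the method names below are illustrative assumptions, not the verified API; see examples/ for real usage):

```
from collab_splats import Splatter  # hypothetical import path

splatter = Splatter("path/to/video_or_images")
splatter.preprocess()             # e.g. frame extraction + SfM poses
splatter.train(model="rade-gs")   # train a splatting model in nerfstudio
splatter.visualize()              # open the nerfstudio viewer
mesh = splatter.mesh()            # post-process the splats into a mesh
```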

For examples of these different functionalities, please navigate to the examples/ directory.

The Docker image contains an example splat video at /opt/data/C0043.MP4.

Problems

Plots not showing up? Check your VSCode port-forwarding settings.
