
LoopSparseGS: Loop Based Sparse-View Friendly Gaussian Splatting

Zhenyu Bao1  Guibiao Liao1, *  Kaichen Zhou1  Kanglin Liu2  Qing Li2, *  Guoping Qiu3
1Peking University  2Pengcheng Laboratory  3University of Nottingham
*corresponding author

Paper | Project | Results



Brief Intro

3D Gaussian Splatting in Sparse Setting: Despite the photorealistic novel view synthesis (NVS) performance achieved by the original 3D Gaussian Splatting (3DGS), its rendering quality degrades significantly with sparse input views. This performance drop stems from several challenges. First, given sparse input views, the initial Gaussian points provided by Structure from Motion (SfM) can be sparse and inadequate, as shown in the figure below (top left). Second, reconstructing the appearance and geometry of a scene becomes an under-constrained, ill-posed problem when only image reconstruction constraints are available for the few inputs. Third, the scales of some Gaussians grow very large during optimization; these oversized Gaussian ellipsoids cause overfitting and thus produce unsatisfactory results at novel viewpoints, as illustrated in the figure below (top middle).


[Figure: sparse SfM initialization (top left) and degraded novel-view renderings caused by oversized Gaussians (top middle)]


LoopSparseGS Method: LoopSparseGS is a loop-based 3DGS framework for the sparse novel view synthesis task. Specifically, we propose a loop-based Progressive Gaussian Initialization (PGI) strategy that iteratively densifies the initialized point cloud using pseudo images rendered during training. Then, the sparse but reliable depth from Structure from Motion and window-based dense monocular depth are leveraged to provide precise geometric supervision via the proposed Depth-alignment Regularization (DAR). Additionally, we introduce a novel Sparse-friendly Sampling (SFS) strategy to handle oversized Gaussian ellipsoids that lead to large pixel errors.
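The loop-based paradigm above (train, render pseudo views, densify the point cloud, repeat) can be sketched as pseudocode. All function names and parameters below are illustrative stand-ins, not the repository's actual API:

```python
# Illustrative sketch of the loop-based training paradigm.
# The real training/looping logic lives in train.py and loop.py.

def train_gaussians(num_points, iters):
    # Stand-in for one 3DGS optimization phase.
    return {"points": num_points, "trained_iters": iters}

def render_pseudo_views(model, n_pseudo):
    # Stand-in: render n_pseudo pseudo images around the training views.
    return [f"pseudo_{i}" for i in range(n_pseudo)]

def densify_point_cloud(num_points, pseudo_images):
    # PGI: grow the initial point cloud using the rendered pseudo images.
    return num_points + len(pseudo_images)

def loopsparsegs(initial_points, n_loops=3, iters_per_phase=10000, n_pseudo=2):
    points = initial_points
    for _ in range(n_loops):
        model = train_gaussians(points, iters_per_phase)   # training phase
        pseudo = render_pseudo_views(model, n_pseudo)      # pseudo views
        points = densify_point_cloud(points, pseudo)       # looping phase
    return train_gaussians(points, iters_per_phase)        # final training
```

The key design point is that each looping phase feeds a denser initialization into the next training phase, progressively compensating for the sparse SfM point cloud.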


Prerequisites

1.1 Equipment

We have validated our code on the following equipment:

  • Ubuntu 18.04 or Windows 11
  • Python 3.8 + CUDA 11.3 + torch 11.7
  • COLMAP >= 3.7

1.2 Setup

Please run the following commands to clone the repository and install the required packages. If COLMAP is not installed in your environment, please follow the official instructions.

git clone https://github.com/pcl3dv/LoopSparseGS.git
cd LoopSparseGS

conda env create --file environment.yml
conda activate loopsparsegs

1.3 Data Preparation

Download the LLFF dataset from the official link or the mirrored link. Make sure the downloaded data is located at data/nerf_llff_data/scene.

Before training, please run the following command to generate the initial n-view COLMAP files and the projected depths.

cd LoopSparseGS
python tools/pre_llff.py

Training

The training paradigm is: training -> looping -> training -> looping -> training ...

2.1 Training and Looping

Scripts: You can run the provided script to train an LLFF scene (e.g., fern).

cd LoopSparseGS
bash script/train_llff.sh fern

Parameter explanations: Key parameters used in the training (train.py) and looping (loop.py) phases are described below.

For train.py:

  • --train_sub: Number of training views.
  • --exp_name: Output files are saved in output/${exp_name}.
  • --pseudo_loop_iters: Current loop round.
  • --pseudo_nums_per_img: The number of new pseudo-views around each training view.
  • --patch_length: The window length used in the DAR method.
  • -sps/--sparse_sampling: Enable sparse-friendly sampling.
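To give intuition for what --patch_length controls, the sketch below computes a window-based, correlation-style depth loss over 1-D depth arrays. This is a simplified stand-in under our own assumptions, not the repository's DAR implementation (which operates on image windows and also uses sparse SfM depth for alignment):

```python
import math

def patch_pearson(rendered, mono):
    # Pearson correlation over one window; invariant to the scale/shift
    # ambiguity of monocular depth.
    n = len(rendered)
    mr, mm = sum(rendered) / n, sum(mono) / n
    cov = sum((r - mr) * (m - mm) for r, m in zip(rendered, mono))
    vr = math.sqrt(sum((r - mr) ** 2 for r in rendered))
    vm = math.sqrt(sum((m - mm) ** 2 for m in mono))
    return cov / (vr * vm + 1e-8)

def window_depth_loss(rendered_depth, mono_depth, patch_length):
    # Split both depth arrays into non-overlapping windows of patch_length
    # and penalize low correlation within each window.
    losses = []
    for i in range(0, len(rendered_depth), patch_length):
        r = rendered_depth[i:i + patch_length]
        m = mono_depth[i:i + patch_length]
        if len(r) > 1:
            losses.append(1.0 - patch_pearson(r, m))
    return sum(losses) / len(losses)
```

A smaller window makes the supervision more local and tolerant of global inconsistencies in the monocular depth; a larger window enforces agreement over broader regions.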

For loop.py:

  • -m/--model_path: Path to the latest training output directory.
  • -p/--pseudo_loop_iters: Loop round of the latest training.

2.2 Render

The render process runs automatically at the end of each training phase. Use the following command if you need to render again manually.

python render.py -m output/fern --skip_train --render_depth

2.3 Evaluation

During the training process, we record metrics every 1000 iterations and store them in ${model_path}/record_psnr.txt. The following command can be used for additional evaluation.

python metrics.py -m ./output/fern
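For reference, the PSNR values recorded above are the standard peak signal-to-noise ratio. For images normalized to [0, max_val], PSNR relates to mean squared error as follows (a textbook formula, not code from metrics.py):

```python
import math

def psnr(mse, max_val=1.0):
    # Peak signal-to-noise ratio in dB: 10 * log10(max_val^2 / MSE).
    # Lower MSE between rendered and ground-truth images -> higher PSNR.
    return 10.0 * math.log10(max_val ** 2 / mse)
```

For example, an MSE of 0.01 on [0, 1] images corresponds to a PSNR of 20 dB.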

TODO list

  • Release the training and looping code
  • Release the code for training on custom data
  • Release the results reported in our paper

Acknowledgement

This work was inspired by 3DGS and FSGS. We thank the authors for their great projects.

Citation

Please cite as below if you find this repository helpful to your project:

@article{bao2024loopsparsegs,
  title={LoopSparseGS: Loop Based Sparse-View Friendly Gaussian Splatting},
  author={Bao, Zhenyu and Liao, Guibiao and Zhou, Kaichen and Liu, Kanglin and Li, Qing and Qiu, Guoping},
  journal={arXiv preprint arXiv:2408.00254},
  year={2024}
}

About

[IEEE TIP 2025] The official repository of LoopSparseGS.
