LoopSparseGS

Loop Based Sparse-View Friendly Gaussian Splatting

Zhenyu Bao, Guibiao Liao, Kaichen Zhou, Kanglin Liu, Qing Li, Guoping Qiu

Peking University · Pengcheng Laboratory · University of Nottingham

Abstract

Despite the photorealistic novel view synthesis (NVS) performance achieved by the original 3D Gaussian splatting (3DGS), its rendering quality degrades significantly with sparse input views. This performance drop stems mainly from the limited number of initial points generated from the sparse input, insufficient supervision during training, and inadequate regularization of oversized Gaussian ellipsoids. To address these issues, we propose LoopSparseGS, a loop-based 3DGS framework for sparse-input novel view synthesis. Specifically, we propose a loop-based Progressive Gaussian Initialization (PGI) strategy that iteratively densifies the initialized point cloud using pseudo-images rendered during training. Then, the sparse but reliable depth from Structure-from-Motion (SfM) and window-based dense monocular depth are leveraged to provide precise geometric supervision via the proposed Depth-alignment Regularization (DAR). Additionally, we introduce a novel Sparse-friendly Sampling (SFS) strategy to handle oversized Gaussian ellipsoids that lead to large pixel errors. Comprehensive experiments on four datasets demonstrate that LoopSparseGS outperforms existing state-of-the-art methods for sparse-input novel view synthesis across indoor, outdoor, and object-level scenes with various image resolutions.
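To make the depth supervision concrete, below is a minimal PyTorch sketch of how a window-based depth-alignment loss in the spirit of DAR could look. This is an illustrative sketch, not the authors' implementation: the per-window least-squares scale-and-shift alignment, the window size win, and the helper names align_window and dar_loss are assumptions made for this example. The only inputs taken from the description above are a rendered depth map, a dense monocular depth map, and sparse SfM depths with a validity mask.

import torch

def align_window(mono: torch.Tensor, sfm: torch.Tensor, mask: torch.Tensor):
    # Hypothetical alignment choice: least-squares scale s and shift t so that
    # s * mono + t matches the sparse SfM depths inside this window.
    m, d = mono[mask], sfm[mask]
    if m.numel() < 2:
        return None  # too few SfM anchors in this window to fit scale and shift
    A = torch.stack([m, torch.ones_like(m)], dim=1)       # (N, 2)
    sol = torch.linalg.lstsq(A, d.unsqueeze(1)).solution  # (2, 1): [s, t]
    return sol[0, 0] * mono + sol[1, 0]

def dar_loss(rendered: torch.Tensor, mono: torch.Tensor,
             sfm: torch.Tensor, mask: torch.Tensor, win: int = 64):
    # Window-based depth supervision: align the dense monocular depth to the
    # sparse SfM depth per window, then penalize the rendered depth (L1).
    H, W = rendered.shape
    loss, n = rendered.new_zeros(()), 0
    for y in range(0, H, win):
        for x in range(0, W, win):
            sl = (slice(y, y + win), slice(x, x + win))
            aligned = align_window(mono[sl], sfm[sl], mask[sl])
            if aligned is not None:
                loss = loss + (rendered[sl] - aligned).abs().mean()
                n += 1
    return loss / max(n, 1)

In a full pipeline, a loss of this form would be added to the photometric loss at each training iteration; the outer PGI loop, which periodically re-densifies the initial point cloud from rendered pseudo views, and the SFS sampling both depend on the 3DGS rasterizer and are omitted here.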


Comparison Results

[Qualitative comparisons: 3DGS · DNGaussian · FSGS · LoopSparseGS]

Ablation Results

[Ablation visualizations: 3DGS · + PGI · + PGI + DAR · + PGI + DAR + SFS]

More Visualization Results (LLFF, 3 Training Views)

[Rendered novel views for the LLFF scenes: fern, flower, fortress, horns, leaves, orchids, room, trex]

More Visualization Results (Mip-NeRF360, 24 Training Views)

[Rendered novel views and depth maps for the Mip-NeRF360 scenes: bicycle, bonsai, counter, garden, kitchen, room]

BibTeX

@article{bao2024loopsparsegs,
  title={LoopSparseGS: Loop Based Sparse-View Friendly Gaussian Splatting},
  author={Bao, Zhenyu and Liao, Guibiao and Zhou, Kaichen and Liu, Kanglin and Li, Qing and Qiu, Guoping},
  journal={arXiv preprint arXiv:2408.00254},
  year={2024},
}