Existing 4D Gaussian Splatting (4DGS) methods struggle to accurately reconstruct dynamic scenes, often suffering from ambiguous pixel correspondences and inadequate densification in dynamic regions. We address these issues with a novel method composed of two key components: (1) Elliptical Error Clustering and Error-Correcting Splat Addition, which pinpoint dynamic areas needing improvement and initialize fitting splats, and (2) Grouped 4D Gaussian Splatting, which improves the consistency of the mapping between splats and the dynamic objects they represent. Specifically, we classify rendering errors into missing-color and occlusion types, then apply targeted corrections via backprojection or foreground splitting, guided by cross-view color consistency. Evaluations on the Neural 3D Video and Technicolor datasets demonstrate that our approach significantly improves temporal consistency and achieves state-of-the-art perceptual rendering quality, improving PSNR by 0.39 dB on the Technicolor Light Field dataset. Our visualizations show improved alignment between splats and dynamic objects, and the error-correction method's ability to identify errors and properly initialize new splats.
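The error classification described above can be illustrated with a minimal sketch. This is not the paper's implementation: the function name, thresholds, and array shapes are hypothetical, and the "elliptical clustering" step is omitted. It only shows the core idea of flagging high-error pixels and labeling each as missing-color (the target color is corroborated by another view, so a splat can be back-projected there) or occlusion (no other view agrees, suggesting a foreground split is needed).

```python
import numpy as np

def classify_error_pixels(rendered, target, other_view_colors,
                          err_thresh=0.05, color_thresh=0.1):
    """Toy sketch (hypothetical names/thresholds, not the paper's code).

    rendered, target:   (H, W, 3) RGB images in [0, 1]
    other_view_colors:  (V, H, W, 3) target-pixel colors as seen from V
                        other views, assumed already warped into this view
    Returns an (H, W) array of labels: "missing-color", "occlusion", "none".
    """
    # Per-pixel L2 rendering error; flag pixels above the error threshold.
    err = np.linalg.norm(rendered - target, axis=-1)
    mask = err > err_thresh

    # Cross-view color consistency: distance from the target color to the
    # closest color observed for that pixel in any other view.
    diffs = np.linalg.norm(other_view_colors - target[None], axis=-1)  # (V, H, W)
    consistent = diffs.min(axis=0) < color_thresh

    labels = np.full(mask.shape, "none", dtype=object)
    labels[mask & consistent] = "missing-color"   # visible elsewhere: backproject a splat
    labels[mask & ~consistent] = "occlusion"      # no agreement: split foreground
    return labels
```

Under this toy rule, a high-error pixel whose ground-truth color is also observed from another view is treated as a densification target, while an inconsistent one is treated as an occlusion error.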
We visualize the correspondence between 4D Gaussian splats and the underlying dynamic objects. Our method achieves better alignment and coherence compared to baseline methods.
Comparison with state-of-the-art methods on various sequences.
Free-viewpoint rendering results demonstrating the temporal consistency and rendering quality of our method.
@inproceedings{kang2025cem4dgs,
title = {Clustered Error Correction with Grouped 4D Gaussian Splatting},
author = {Kang, Taeho and Park, Jaeyeon and Lee, Kyungjin and Lee, Youngki},
booktitle = {SIGGRAPH Asia 2025 Conference Papers},
month = {December},
year = {2025},
pages = {1--12},
publisher = {Association for Computing Machinery},
address = {New York, NY, USA},
doi = {10.1145/3757377.3763858},
url = {https://doi.org/10.1145/3757377.3763858}
}