# FilterGS

**[CVPR'26] FilterGS: Traversal-Free Parallel Filtering and Adaptive Shrinkage for Large-Scale LoD 3D Gaussian Splatting**
[CVPR'26] FilterGS trains highly realistic urban-scale models on a single RTX 4090 and renders them in real time. Visit our project page for more demos.
Our code is built upon PyTorch and leverages gaussian-splatting and LoG techniques.
## Quick Start & Dataset Preparation
For a smooth setup, follow the installation guide.
We employ COLMAP to prepare the dataset. Refer to the preprocessing documentation for detailed instructions.
## Training
Training the model takes a single command. Note: you need to edit the model path in the `train.yml` file first.

```shell
python3 apps/train.py --cfg config/GauUScene/college/train.yml split train
```
We automatically configure heuristic parameters based on the dataset size.
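As an illustration, the path entries you need to edit typically look like the fragment below. The key names here are assumptions for illustration only; check the keys actually present in `config/GauUScene/college/train.yml`.

```yaml
# Hypothetical excerpt of train.yml -- key names are illustrative
# assumptions; edit the corresponding entries in the shipped config.
model_path: /PATH/TO/YOUR/OUTPUT        # where checkpoints are written
source_path: /PATH/TO/YOUR/COLMAP/DATA  # COLMAP-preprocessed dataset
```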
## Rendering
Before rendering, two preprocessing steps are required: computing the ancestor paths of the model, and computing the Gaussian redundancy of the scene. Run the following two commands; together they take about 10 minutes:

```shell
# ancestor path
python apps/ancestor.py --ckpt /PATH/TO/YOUR/MODEL/.pth --out /OUTPUT/ANCESTOR/PATH/model_ancestor.pth

# KPC & GTC
python3 apps/render.py --cfg config/GauUScene/college/render.yml --debug
```
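To give an intuition for the ancestor-path step: in an LoD hierarchy, each node's chain of ancestors up to the root can be precomputed once and reused at render time. The sketch below is illustrative only; the data structures in `apps/ancestor.py` are assumptions, not the repository's actual code.

```python
def ancestor_path(parent, node):
    """Return the chain of nodes from `node` up to the root.

    `parent[i]` is the parent index of node i, or -1 for the root.
    Illustrative sketch of precomputing an LoD ancestor chain.
    """
    path = []
    while node != -1:
        path.append(node)
        node = parent[node]
    return path

# Tiny example hierarchy: node 0 is the root, nodes 1 and 2 are its
# children, and node 3 is a child of node 1.
parent = [-1, 0, 0, 1]
print(ancestor_path(parent, 3))  # -> [3, 1, 0]
```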
Note: each scene only needs to go through the above steps once. For subsequent rendering runs, execute the command below:

```shell
python3 apps/render.py --cfg config/GauUScene/college/render.yml
```
`--skip-save` and `--debug` are optional flags: `--skip-save` skips saving the rendered images, and `--debug` prints detailed parameters of the rendering process.
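For reference, boolean flags like these are typically wired up as `argparse` store-true actions. The snippet below is a minimal sketch of that pattern, not the actual contents of `apps/render.py`, which may differ.

```python
import argparse

# Illustrative sketch: how --skip-save / --debug style flags are
# commonly declared; the real apps/render.py is assumed to differ.
parser = argparse.ArgumentParser(description="renderer flags (sketch)")
parser.add_argument("--cfg", required=True, help="path to the render .yml config")
parser.add_argument("--skip-save", action="store_true",
                    help="skip saving rendered images (benchmark mode)")
parser.add_argument("--debug", action="store_true",
                    help="print detailed rendering parameters")

# argparse converts --skip-save to the attribute name skip_save
args = parser.parse_args(["--cfg", "config/GauUScene/college/render.yml",
                          "--skip-save"])
print(args.skip_save, args.debug)  # -> True False
```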
## Acknowledgements

We gratefully acknowledge the prior work that inspired this project, including gaussian-splatting and LoG.

Contributions are warmly welcomed! If you have made improvements, please consider submitting a pull request. If you have any questions, feel free to open an issue.
## Citation
Our paper will be available within a week. We greatly appreciate your early interest in this work!
