
SuGaR

[CVPR 2024] Official PyTorch implementation of SuGaR: Surface-Aligned Gaussian Splatting for Efficient 3D Mesh Reconstruction and High-Quality Mesh Rendering


<div align="center">

SuGaR: Surface-Aligned Gaussian Splatting for Efficient 3D Mesh Reconstruction and High-Quality Mesh Rendering

<font size="4"> CVPR 2024 </font> <br> <font size="4"> <a href="https://anttwo.github.io/" style="font-size:100%;">Antoine Guédon</a>&emsp; <a href="https://vincentlepetit.github.io/" style="font-size:100%;">Vincent Lepetit</a>&emsp; </font> <br> <font size="4"> LIGM, Ecole des Ponts, Univ Gustave Eiffel, CNRS </font>

| <a href="https://anttwo.github.io/sugar/">Webpage</a> | <a href="https://arxiv.org/abs/2311.12775">arXiv</a> | <a href="https://github.com/Anttwo/sugar_frosting_blender_addon/">Blender add-on</a> | <a href="https://www.youtube.com/watch?v=MAkFyWfiBQo">Presentation video</a> | <a href="https://www.youtube.com/watch?v=YbjE0wnw67I">Viewer video</a> |

<img src="./media/examples/walk.gif" alt="walk.gif" width="350"/><img src="./media/examples/attack.gif" alt="attack.gif" width="350"/> <br> <b>Our method extracts meshes from 3D Gaussian Splatting reconstructions and builds hybrid representations <br>that enable easy composition and animation in Gaussian Splatting scenes by manipulating the mesh.</b>

</div>

Abstract

We propose a method to allow precise and extremely fast mesh extraction from <a href="https://repo-sam.inria.fr/fungraph/3d-gaussian-splatting/">3D Gaussian Splatting (SIGGRAPH 2023)</a>. Gaussian Splatting has recently become very popular as it yields realistic rendering while being significantly faster to train than NeRFs. It is however challenging to extract a mesh from the millions of tiny 3D Gaussians, as these Gaussians tend to be unorganized after optimization and no method has been proposed so far. Our first key contribution is a regularization term that encourages the 3D Gaussians to align well with the surface of the scene. We then introduce a method that exploits this alignment to sample points on the real surface of the scene and extract a mesh from the Gaussians using Poisson reconstruction, which is fast, scalable, and preserves details, in contrast to the Marching Cubes algorithm usually applied to extract meshes from Neural SDFs. Finally, we introduce an optional refinement strategy that binds Gaussians to the surface of the mesh, and jointly optimizes these Gaussians and the mesh through Gaussian splatting rendering. This enables easy editing, sculpting, rigging, animating, or relighting of the Gaussians using traditional software (Blender, Unity, Unreal Engine, etc.) by manipulating the mesh instead of the Gaussians themselves. Retrieving such an editable mesh for realistic rendering is done within minutes with our method, compared to hours with the state-of-the-art method on neural SDFs, while providing a better rendering quality in terms of PSNR, SSIM and LPIPS.
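To make the abstract's key idea concrete, here is a minimal NumPy sketch of the Gaussian-mixture density that the surface-alignment regularization reasons about: the scene is a sum of weighted 3D Gaussians, and aligning them with the surface amounts to shaping this density. The exact weighting and the regularizer itself are simplified assumptions for illustration, not the paper's implementation.

```python
# Sketch: density of a 3D Gaussian mixture, the quantity SuGaR's
# regularizer constrains so that Gaussians align with the surface.
# Opacity weighting here is a simplifying assumption.
import numpy as np

def density(p, means, inv_covs, alphas):
    """Evaluate the mixture density at a 3D point p.

    means:    (N, 3) Gaussian centers
    inv_covs: (N, 3, 3) inverse covariance matrices
    alphas:   (N,) per-Gaussian opacities
    """
    d = p - means                                   # (N, 3) offsets
    mahal = np.einsum('ni,nij,nj->n', d, inv_covs, d)
    return float(np.sum(alphas * np.exp(-0.5 * mahal)))

# One tight, isotropic Gaussian: the density at its center is its opacity.
means = np.zeros((1, 3))
inv_covs = np.eye(3)[None] * 100.0
alphas = np.array([0.9])
print(density(np.zeros(3), means, inv_covs, alphas))  # 0.9 at the center
```

Flattening each Gaussian and pushing this density toward an ideal surface-concentrated profile is what lets the method later sample points on the real surface for Poisson reconstruction.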

<div align="center"> <b>Hybrid representation (Mesh + Gaussians on the surface)</b><br> <img src="./media/overview/garden_hybrid.gif" alt="garden_hybrid.gif" width="250"/> <img src="./media/overview/kitchen_hybrid.gif" alt="kitchen_hybrid.gif" width="250"/> <img src="./media/overview/counter_hybrid.gif" alt="counter_hybrid.gif" width="250"/><br> <img src="./media/overview/playroom_hybrid.gif" alt="playroom_hybrid.gif" width="323"/> <img src="./media/overview/qant03_hybrid.gif" alt="qant03_hybrid.gif" width="323"/> <img src="./media/overview/dukemon_hybrid.gif" alt="dukemon_hybrid.gif" width="102"/><br> <b>Underlying mesh without texture</b><br> <img src="./media/overview/garden_notex.gif" alt="garden_notex.gif" width="250"/> <img src="./media/overview/kitchen_notex.gif" alt="kitchen_notex.gif" width="250"/> <img src="./media/overview/counter_notex.gif" alt="counter_notex.gif" width="250"/><br> <img src="./media/overview/playroom_notex.gif" alt="playroom_notex.gif" width="323"/> <img src="./media/overview/qant03_notex.gif" alt="qant03_notex.gif" width="323"/> <img src="./media/overview/dukemon_notex.gif" alt="dukemon_notex.gif" width="102"/><br> </div>

BibTeX

@inproceedings{guedon2023sugar,
  title={SuGaR: Surface-Aligned Gaussian Splatting for Efficient 3D Mesh Reconstruction and High-Quality Mesh Rendering},
  author={Gu{\'e}don, Antoine and Lepetit, Vincent},
  booktitle={Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
  year={2024}
}

Updates and To-do list

<details> <summary><span style="font-weight: bold;">Updates</span></summary> <ul> <li><b>[09/18/2024]</b> Improved the quality of the extracted meshes with the new <code>dn_consistency</code> regularization method, and added compatibility with the new Blender add-on for composition and animation.</li> <li><b>[01/09/2024]</b> Added a dedicated, real-time viewer to let users visualize and navigate the reconstructed scenes (hybrid representation, textured mesh and wireframe mesh).</li> <li><b>[12/20/2023]</b> Added a short notebook showing how to render images with the hybrid representation using the Gaussian Splatting rasterizer.</li> <li><b>[12/18/2023]</b> Code release.</li> </ul> </details><br> <details> <summary><span style="font-weight: bold;">To-do list</span></summary> <ul> <li><b>Viewer:</b> Add an option to load the postprocessed mesh.</li> <li><b>Mesh extraction:</b> Add the possibility to edit the extent of the background bounding box.</li> <li><b>Tips&amp;Tricks:</b> Add some tips and tricks to the README.md file (and the webpage) for using SuGaR on your own data and obtaining better reconstructions (see the tips provided by user kitmallet).</li> <li><b>Improvement:</b> Add an <code>if</code> block to <code>sugar_extractors/coarse_mesh.py</code> to skip foreground mesh reconstruction and avoid triggering an error if no surface point is detected inside the foreground bounding box. This can be useful for users who want to reconstruct "<i>background scenes</i>".</li> <li><b>Using precomputed masks with SuGaR:</b> Add a mask functionality to the SuGaR optimization, allowing users to mask out some pixels in the training images (such as white backgrounds in synthetic datasets).</li> <li><b>Using SuGaR with Windows:</b> Adapt the code to make it compatible with Windows. Due to path-writing conventions, the current code is not compatible with Windows.</li> <li><b>Synthetic datasets:</b> Add the possibility to use the NeRF synthetic dataset (which has a different format than COLMAP scenes).</li> <li><b>Composition and animation:</b> Finish cleaning the code for composition and animation, and add it to the <code>sugar_scene/sugar_compositor.py</code> script.</li> <li><b>Composition and animation:</b> Make a tutorial on how to use the scripts in the <code>blender</code> directory and the <code>sugar_scene/sugar_compositor.py</code> class to import composition and animation data into PyTorch and apply it to the SuGaR hybrid representation.</li> </ul> </details>
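The to-do item about skipping foreground mesh reconstruction could look roughly like the guard below. The function and argument names are hypothetical (the actual code lives in <code>sugar_extractors/coarse_mesh.py</code>); this only illustrates the intended behavior of skipping instead of erroring for background-only scenes.

```python
# Hypothetical guard sketching the to-do item above: skip foreground
# mesh reconstruction when no surface point falls inside the foreground
# bounding box, instead of raising an error. Names are illustrative only.
def extract_meshes(fg_surface_points, extract_fn):
    """Return a list of extracted meshes; skip the foreground mesh
    gracefully when there is nothing to reconstruct."""
    meshes = []
    if len(fg_surface_points) == 0:
        print("No surface points in the foreground bounding box; "
              "skipping foreground mesh (background-only scene).")
    else:
        meshes.append(extract_fn(fg_surface_points))
    return meshes
```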

Overview

As we explain in the paper, SuGaR optimization starts by optimizing a vanilla 3D Gaussian Splatting model for 7k iterations with no additional regularization term. Consequently, the current implementation contains a version of the original <a href="https://github.com/graphdeco-inria/gaussian-splatting">3D Gaussian Splatting code</a>, and our model is built as a wrapper around a vanilla 3D Gaussian Splatting model. Please note that, although this wrapper design is convenient for many reasons, it may not be optimal in terms of memory usage.

The full SuGaR pipeline consists of 4 main steps, and an optional one:

  1. Short vanilla 3DGS optimization: optimizing a vanilla 3D Gaussian Splatting model for 7k iterations, in order to let Gaussians position themselves in the scene.
  2. SuGaR optimization: optimizing Gaussians alignment with the surface of the scene.
  3. Mesh extraction: extracting a mesh from the optimized Gaussians.
  4. SuGaR refinement: refining the Gaussians and the mesh together to build a hybrid Mesh+Gaussians representation.
  5. Textured mesh extraction (Optional): extracting a traditional textured mesh from the refined SuGaR model as a tool for visualization, composition and animation in Blender using our <a href="https://github.com/Anttwo/sugar_frosting_blender_addon/">Blender add-on</a>.

We provide a dedicated script for each of these steps, as well as a <code>train_full_pipeline.py</code> script that runs the entire pipeline. We explain how to use these scripts in the next sections. <br>
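As a rough illustration of how the five steps above collapse into a single invocation, the sketch below assembles a <code>train_full_pipeline.py</code> command line. The flag names (<code>-s</code>, <code>-r</code>, <code>--export_obj</code>) are assumptions for illustration; check the repository's usage instructions for the authoritative arguments.

```python
# Sketch: building a full-pipeline invocation. Flag names are
# assumptions, not a verified CLI reference for this repository.
import shlex

def full_pipeline_cmd(scene_path, regularization="dn_consistency",
                      export_obj=True):
    """Assemble the command that runs all pipeline steps at once."""
    cmd = ["python", "train_full_pipeline.py",
           "-s", scene_path,        # COLMAP scene directory
           "-r", regularization]    # surface-alignment regularizer
    if export_obj:                  # optional textured-mesh step
        cmd += ["--export_obj", "True"]
    return shlex.join(cmd)

print(full_pipeline_cmd("./data/garden"))
```

The optional textured-mesh export corresponds to step 5 and is kept on by default, matching the behavior described below.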

<div align="center"><br> <img src="./media/blender/blender_edit.png" alt="blender_edit.png" height="200"/> <img src="./media/examples/attack.gif" alt="attack.gif" height="200"/> <br><b>You can visualize, edit, combine or animate the reconstructed textured meshes in Blender <i>(left)</i> <br>and render the result with SuGaR <i>(right)</i> thanks to our <a href="https://github.com/Anttwo/sugar_frosting_blender_addon/">Blender add-on</a>.</b><br> </div><br>

Please note that the final step, Textured mesh extraction, is optional but is enabled by default in the train_full_pipeline.py script. Indeed, it is very convenient to have a traditional textured mesh for visualization, composition and animation using traditional software such as <a href="https://github.com/Anttwo/sugar_frosting_blender_addon/">Blender</a>. If you installed Nvdiffrast as described below, this step should only take a few seconds anyway.

<div align="center"> <b>Hybrid representation (Mesh + Gaussians on the surface)</b><br> <img src="./media/overview/garden_hybrid.png" alt="garden_hybrid.png" height="135"/> <img src="./media/overview/kitchen_hybrid.png" alt="kitchen_hybrid.png" height="135"/> <img src="./media/overview/qant03_hybrid.png" alt="qant03_hybrid.png" height="135"/> <img src="./media/overview/dukemon_hybrid.png" alt="dukemon_hybrid.png" height="135"/><br> <b>Underlying mesh with a traditional colored UV texture</b><br> <img src="./media/overview/garden_texture.png" alt="garden_texture.png" height="135"/> </div>