
GeoCode: Interpretable Shape Programs [Project Page]

GitHub Pages CGF arXiv

Ofek Pearl, Itai Lang, Yuhua Hu, Raymond A. Yeh, Rana Hanocka


The task of crafting procedural programs capable of generating structurally valid 3D shapes easily and intuitively remains an elusive goal in computer vision and graphics. Within the graphics community, generating procedural 3D models has shifted to using node graph systems. They allow the artist to create complex shapes and animations through visual programming. Being a high-level design tool, they made procedural 3D modeling more accessible. However, crafting those node graphs demands expertise and training. We present GeoCode, a novel framework designed to extend an existing node graph system and significantly lower the bar for the creation of new procedural 3D shape programs. Our approach meticulously balances expressiveness and generalization for part-based shapes. We propose a curated set of new geometric building blocks that are expressive and reusable across domains. We showcase three innovative and expressive programs developed through our technique and geometric building blocks. Our programs enforce intricate rules, empowering users to execute intuitive high-level parameter edits that seamlessly propagate throughout the entire shape at a lower level while maintaining its validity. To evaluate the user-friendliness of our geometric building blocks among non-experts, we conducted a user study that demonstrates their ease of use and highlights their applicability across diverse domains. Empirical evidence shows the superior accuracy of GeoCode in inferring and recovering 3D shapes compared to an existing competitor. Furthermore, our method demonstrates superior expressiveness compared to alternatives that utilize coarse primitives. Notably, we illustrate the ability to execute controllable local and global shape manipulations.

<p align="center"> <img src="https://github.com/threedle/GeoCode/releases/download/v.1.0.0/demo_video_chair.gif" width=250 alt="3D shape recovery"/> <img src="https://github.com/threedle/GeoCode/releases/download/v.1.0.0/demo_video_vase.gif" width=250 alt="3D shape recovery"/> <img src="https://github.com/threedle/GeoCode/releases/download/v.1.0.0/demo_video_table.gif" width=250 alt="3D shape recovery"/> </p> <p align="center"> [MARCH 2025 UPDATE] Two new programs created with the GeoCode Blender add-on were added; additionally, the chair program was recreated using the GeoCode add-on. </p> <p align="center"> <img src="https://github.com/threedle/GeoCode/releases/download/v.1.0.0/demo_video_cabinet.gif" width=250 alt="3D shape recovery"/> <img src="https://github.com/threedle/GeoCode/releases/download/v.1.0.0/demo_video_ceiling_lamp.gif" width=250 alt="3D shape recovery"/> </p> <p align="center"> A demo video of our program is available on our <a href="https://threedle.github.io/GeoCode/">project page</a>. </p>

[MARCH 2025 UPDATE] GeoCode Blender Add-On


We provide a Blender add-on that helps non-experts create procedural programs in Blender. Please check out the GeoCode Blender add-on here:

GeoCode Blender Add-On Download

We also provide a Google Docs guide (based on our user study):

GeoCode Blender Add-On Guide

Requirements

  • Python 3.8 or higher
  • CUDA 11.8 or higher
  • A GPU with at least 8 GB of RAM
  • For training, a machine with 5 CPUs is recommended
  • For visualization and sketch generation, we recommend a setup with multiple GPU nodes; see the Additional Information section for running in parallel on all available nodes
  • For test-set evaluation, generating raw shapes for a new dataset, and stability-metric evaluation, a single node with 20 CPUs is recommended
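A quick way to verify the Python floor from the list above (the CUDA and GPU checks depend on driver tooling such as nvidia-smi, so only the interpreter check is sketched here):

```python
import sys

# GeoCode requires Python 3.8 or higher (see the requirements list above)
ok = sys.version_info >= (3, 8)
print(f"Python {sys.version.split()[0]} meets the 3.8 floor: {ok}")
```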

Running the test-set evaluation using our dataset and saved checkpoint

<p align="center"> <img src="resources/chair_back_frame_mid_y_offset_pct_0_0000_0002.png" alt="3D shape recovery"/> </p>

Installation

Clone and create the Conda environment

git clone https://github.com/threedle/GeoCode.git
cd GeoCode
conda env create -f environment.yml
conda activate geocode
python setup.py install

# Install Blender 4.2 under `~/Blender`
(sudo) chmod +x ./scripts/install_blender4.2.sh
./scripts/install_blender4.2.sh

# Download the dataset (`~/datasets`), checkpoint (`~/models`) and blend file (`~/blends`) of the `chair` domain
python scripts/download_ds.py --domain chair --datasets-dir ~/datasets --models-dir ~/models --blends-dir ~/blends

The vase and table domains are also available.

Run the test for the chair domain (1 GPU and 20 CPUs setup is recommended)

Run the test for the chair domain using the downloaded checkpoint; make sure the directories match those used in the download_ds.py step.

cd GeoCode
conda activate geocode
python geocode/geocode.py test --blender-exe ~/Blender/blender-4.2.3-linux-x64/blender --blend-file ~/blends/procedural_chair.blend --models-dir ~/models --dataset-dir ~/datasets/ChairDataset --input-type pc sketch --phase test --exp-name exp_geocode_chair

This will generate the results in the following directory structure:

<datasets-dir>
│
└───ChairDataset
    │
    └───test
        │
        └───results_exp_geocode_chair
            │
            └───barplot                    <-- model accuracy graph
            └───obj_gt                     <-- 3D objects of the ground truth samples
            └───obj_predictions_pc         <-- 3D objects predicted from point cloud input
            └───obj_predictions_sketch     <-- 3D objects predicted from sketch input
            └───yml_gt                     <-- labels of the ground truth objects
            └───yml_predictions_pc         <-- labels of the objects predicted from point cloud input
            └───yml_predictions_sketch     <-- labels of the objects predicted from sketch input
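As a quick sanity check after the run, predictions can be paired with their ground-truth counterparts by filename stem. This is a hedged sketch, not part of GeoCode itself: the directory names are taken from the tree above, `pair_by_stem` is an invented helper, and the demo uses a throwaway directory rather than a real results folder.

```python
from pathlib import Path
import tempfile

def pair_by_stem(gt_dir, pred_dir, suffix=".obj"):
    """Map each ground-truth file stem to its prediction path, or None if missing."""
    gt = {p.stem: p for p in Path(gt_dir).glob(f"*{suffix}")}
    pred = {p.stem: p for p in Path(pred_dir).glob(f"*{suffix}")}
    return {stem: pred.get(stem) for stem in gt}

# Demo on a temporary directory (stand-in for results_exp_geocode_chair)
with tempfile.TemporaryDirectory() as tmp:
    gt_dir = Path(tmp) / "obj_gt"
    pred_dir = Path(tmp) / "obj_predictions_pc"
    gt_dir.mkdir(); pred_dir.mkdir()
    (gt_dir / "chair_0001.obj").touch()
    (gt_dir / "chair_0002.obj").touch()
    (pred_dir / "chair_0001.obj").touch()
    pairs = pair_by_stem(gt_dir, pred_dir)
    missing = [s for s, p in pairs.items() if p is None]
    print("missing predictions:", missing)  # → ['chair_0002']
```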

We also provide a way to automatically render the resulting 3D objects. Please note that this step is GPU-intensive due to rendering; using multiple GPU nodes is recommended. Please see the Additional Information section for running this in parallel.

cd GeoCode
conda activate geocode
~/Blender/blender-4.2.3-linux-x64/blender ~/blends/procedural_chair.blend -b --python visualize_results/visualize.py -- --dataset-dir ~/datasets/ChairDataset --phase test --exp-name exp_geocode_chair

This will generate the following additional directories under results_exp_geocode_chair:

            ⋮
            └───render_gt                  <-- renders of the ground truth objects
            └───render_predictions_pc      <-- renders of the objects predicted from point cloud input
            └───render_predictions_sketch  <-- renders of the objects predicted from sketch input

Run training on our dataset (1 GPU and 5 CPUs setup with 5GB of memory per CPU is recommended)

Training from scratch and resuming from a checkpoint are run the same way; which one happens depends only on whether a latest.ckpt checkpoint file exists in the experiment directory (under ~/models in this example). Please note that training from our checkpoints will show a starting epoch of 0.

cd GeoCode
conda activate geocode
python geocode/geocode.py train --models-dir ~/models --dataset-dir ~/datasets/ChairDataset --nepoch=600 --batch_size=33 --input-type pc sketch --exp-name exp_geocode_chair
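The resume rule described above can be sketched as follows. This is an illustration only: the actual logic lives inside geocode/geocode.py, and `resolve_checkpoint` is an invented name.

```python
from pathlib import Path
import tempfile

def resolve_checkpoint(models_dir, exp_name):
    """Return the checkpoint path to resume from, or None for a fresh run.

    Mirrors the rule above: training resumes if and only if latest.ckpt
    exists in the experiment directory under models_dir.
    """
    ckpt = Path(models_dir).expanduser() / exp_name / "latest.ckpt"
    return ckpt if ckpt.is_file() else None

# Demo with a throwaway directory
with tempfile.TemporaryDirectory() as tmp:
    exp = Path(tmp) / "exp_geocode_chair"
    exp.mkdir()
    print(resolve_checkpoint(tmp, "exp_geocode_chair"))  # None: fresh training
    (exp / "latest.ckpt").touch()
    resumed = resolve_checkpoint(tmp, "exp_geocode_chair")
    print(resumed is not None)  # True: resume from the checkpoint
```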

Inspecting the blend files

Open one of the blend files using Blender 4.2.

To modify the shape using the parameters and to inspect the Geometry Nodes program, click the "Geometry Nodes" workspace at the top of the window.

(screenshot: selecting the Geometry Nodes workspace)

Then you will see the following screen:

(screenshot: the Geometry Nodes workspace)

Additional Information

Logging

For logging during training, we encourage the use of neptune.ai. First, open an account and create a project, then create the file GeoCode/config/neptune_config.yml with the following content:

neptune:
  api_token: "<TOKEN>"
  project: "<PROJECT_PATH>"
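A minimal stdlib-only reader for this two-key file, useful for sanity-checking the config before training. This is a sketch: the repository itself presumably loads the file with a YAML library from its Conda environment, and `load_neptune_config` is an invented name.

```python
from pathlib import Path
import tempfile

def load_neptune_config(path):
    """Parse the flat key: "value" pairs under the neptune: block."""
    cfg = {}
    for raw in Path(path).read_text().splitlines():
        line = raw.strip()
        if ":" in line and not line.endswith(":"):
            key, _, value = line.partition(":")
            cfg[key.strip()] = value.strip().strip('"')
    return cfg

# Demo: write the sample config to a temp file and read it back
sample = 'neptune:\n  api_token: "<TOKEN>"\n  project: "<PROJECT_PATH>"\n'
with tempfile.NamedTemporaryFile("w", suffix=".yml", delete=False) as f:
    f.write(sample)
cfg = load_neptune_config(f.name)
print(cfg)  # → {'api_token': '<TOKEN>', 'project': '<PROJECT_PATH>'}
```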

Downloading the datasets, blend files, and checkpoint files

When downloading one or more domains using:

python scripts/download_ds.py --domain chair --datasets-dir ~/datasets --models-dir ~/models --blends-dir ~/blends
python scripts/download_ds.py --domain vase --datasets-dir ~/datasets --models-dir ~/models --blends-dir ~/blends
python scripts/download_ds.py --domain table --datasets-dir ~/datasets --models-dir ~/models --blends-dir ~/blends

The resulting directory structure will be (example for the chair domain):

<datasets-dir>
│
└───ChairDataset
    │
    └───recipe.yml
    │
    └───train
    │   └───obj_gt
    │   └───point_cloud_fps
    │   └───point_cloud_random
    │   └───sketches
    │   └───yml_gt
    │   └───yml_gt_normalized
    │
    └───val
    │   └───obj_gt
    │   └───...
    │
    └───test
        └───obj_gt
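A small sanity check that the expected phase directories exist after download. The directory names are taken from the tree above; `check_dataset_layout` is an invented helper, demonstrated on a throwaway directory.

```python
from pathlib import Path
import tempfile

EXPECTED_PHASES = ("train", "val", "test")  # phases from the layout above

def check_dataset_layout(dataset_dir):
    """Return the phases that are missing an obj_gt subdirectory."""
    root = Path(dataset_dir).expanduser()
    return [p for p in EXPECTED_PHASES if not (root / p / "obj_gt").is_dir()]

# Demo: a partial layout with only train/obj_gt present
with tempfile.TemporaryDirectory() as tmp:
    (Path(tmp) / "train" / "obj_gt").mkdir(parents=True)
    missing = check_dataset_layout(tmp)
    print("missing phases:", missing)  # → ['val', 'test']
```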
        