DracoPy
Python wrapper for Google's Draco mesh compression library.
Install / Use
import DracoPy

with open('bunny.drc', 'rb') as draco_file:
    mesh = DracoPy.decode(draco_file.read())

print(f"number of points: {len(mesh.points)}")
print(f"number of faces: {len(mesh.faces)}")
print(f"number of normals: {len(mesh.normals)}")

# Note: If mesh.points is an integer numpy array,
# it will be encoded as an integer attribute. Otherwise,
# it will be encoded as floating point.
binary = DracoPy.encode(mesh.points, mesh.faces)
with open('bunny_test.drc', 'wb') as test_file:
    test_file.write(binary)

# If faces is omitted, DracoPy will encode a point cloud.
binary = DracoPy.encode(mesh.points)

# Options for encoding:
binary = DracoPy.encode(
    mesh.points, faces=mesh.faces,
    quantization_bits=14, compression_level=1,
    quantization_range=-1, quantization_origin=None,
    create_metadata=False, preserve_order=False,
    colors=mesh.colors
)
Installation
Binary wheels are available for users with Python >= 3.6 and pip >= 20.
Installation from source requires Python >= 3.6, pip >= 10, and a C++ compiler that is fully compatible with C++11.
DracoPy supports Linux, macOS, and Windows. NumPy is required.
pip install DracoPy
Acknowledgements
We gratefully thank The Stanford 3D Scanning Repository for providing the Stanford Bunny test model.
https://graphics.stanford.edu/data/3Dscanrep/
