TextureSignalProcessing
Gradient-Domain Processing within a Texture Atlas
<center><h2>Gradient Domain Texture Processing (Version 9.60)</h2></center>
<center>
<a href="#LINKS">links</a>
<a href="#EXECUTABLES">executables</a>
<a href="#USAGE">usage</a>
<a href="#LIBRARY">library</a>
<a href="#COMPILATION">compilation</a>
<a href="#CHANGES">changes</a>
<a href="#SUPPORT">support</a>
</center>
<hr>
This software supports gradient-domain signal processing within a texture atlas. Supported applications include:
<UL>
<LI>(localized) texture smoothing and sharpening
<LI>vector-field visualization akin to line-integral convolution
<LI>computation of single-source geodesics
<LI>simulation of reaction-diffusion following the Gray-Scott model
<LI>dilation of texture into the gutter
<LI>masking of gutter/interior/boundary texels
<LI>solving for the smoothest interpolant within a prescribed subset of texels
</UL>
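As a sketch of the reaction-diffusion application, the Gray-Scott model evolves two concentrations <I>U</I> and <I>V</I> by <I>u<sub>t</sub> = D<sub>u</sub>&Delta;u &minus; uv&sup2; + F(1&minus;u)</I> and <I>v<sub>t</sub> = D<sub>v</sub>&Delta;v + uv&sup2; &minus; (F+k)v</I>. The snippet below runs the model on a flat periodic 2D grid with illustrative parameter values; it is a minimal stand-in for the atlas-based surface solver, not this library's API:

```python
import numpy as np

def gray_scott_step(U, V, Du=0.16, Dv=0.08, F=0.035, k=0.065, dt=1.0):
    # Explicit Euler step of the Gray-Scott model on a periodic grid.
    def lap(A):  # 5-point Laplacian with wrap-around boundaries
        return (np.roll(A, 1, 0) + np.roll(A, -1, 0)
                + np.roll(A, 1, 1) + np.roll(A, -1, 1) - 4 * A)
    uvv = U * V * V                      # reaction term U + 2V -> 3V
    U += dt * (Du * lap(U) - uvv + F * (1 - U))
    V += dt * (Dv * lap(V) + uvv - (F + k) * V)
    return U, V

n = 64
U = np.ones((n, n))
V = np.zeros((n, n))
U[28:36, 28:36] = 0.5                    # seed a small perturbation
V[28:36, 28:36] = 0.25
for _ in range(100):
    U, V = gray_scott_step(U, V)
```

On a texture atlas the same update applies, with the flat Laplacian replaced by the surface Laplacian assembled over the charts.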
<hr>
<a name="LINKS"><b>LINKS</b></a><br>
<ul>
<b>Papers:</b>
<a href="http://www.cs.jhu.edu/~misha/MyPapers/SIG18.pdf">[Prada, Kazhdan, Chuang, and Hoppe, 2018]</a>,
<a href="https://en.wikipedia.org/wiki/Line_integral_convolution">[Cabral and Leedom, 1993]</a>,
<a href="https://www.cs.cmu.edu/~kmcrane/Projects/HeatMethod/">[Crane, Weischedel, and Wardetzky, 2013]</a>
<br>
<b>Executables: </b>
<a href="TSP.x64.zip">Win64</a><br>
<b>Source Code:</b>
<a href="TSP.Source.zip">ZIP</a> <a href="https://github.com/mkazhdan/TextureSignalProcessing">GitHub</a><br>
<B>Data:</B>
<A HREF="http://www.cs.jhu.edu/~misha/Code/TextureSignalProcessing/TSP.Data.zip">ZIP</A><br>
<b>Older Versions:</b>
<a href="http://www.cs.jhu.edu/~misha/Code/TextureSignalProcessing/Version9.55/">V9.55</a>,
<a href="http://www.cs.jhu.edu/~misha/Code/TextureSignalProcessing/Version9.50/">V9.50</a>,
<a href="http://www.cs.jhu.edu/~misha/Code/TextureSignalProcessing/Version9.10/">V9.10</a>,
<a href="http://www.cs.jhu.edu/~misha/Code/TextureSignalProcessing/Version9.05/">V9.05</a>,
<a href="http://www.cs.jhu.edu/~misha/Code/TextureSignalProcessing/Version9.00/">V9.00</a>,
<a href="http://www.cs.jhu.edu/~misha/Code/TextureSignalProcessing/Version8.00/">V8.00</a>,
<a href="http://www.cs.jhu.edu/~misha/Code/TextureSignalProcessing/Version7.00/">V7.00</a>,
<a href="http://www.cs.jhu.edu/~misha/Code/TextureSignalProcessing/Version6.06/">V6.06</a>,
<a href="http://www.cs.jhu.edu/~misha/Code/TextureSignalProcessing/Version6.00/">V6.00</a>,
<a href="http://www.cs.jhu.edu/~misha/Code/TextureSignalProcessing/Version5.01/">V5.01</a>,
<a href="http://www.cs.jhu.edu/~misha/Code/TextureSignalProcessing/Version5.00/">V5.00</a>,
<a href="http://www.cs.jhu.edu/~misha/Code/TextureSignalProcessing/Version4.75/">V4.75</a>,
<a href="http://www.cs.jhu.edu/~misha/Code/TextureSignalProcessing/Version4.60/">V4.60</a>,
<a href="http://www.cs.jhu.edu/~misha/Code/TextureSignalProcessing/Version4.50/">V4.50</a>,
<a href="http://www.cs.jhu.edu/~misha/Code/TextureSignalProcessing/Version4.08/">V4.08</a>,
<a href="http://www.cs.jhu.edu/~misha/Code/TextureSignalProcessing/Version4.07/">V4.07</a>,
<a href="http://www.cs.jhu.edu/~misha/Code/TextureSignalProcessing/Version4.06/">V4.06</a>,
<a href="http://www.cs.jhu.edu/~misha/Code/TextureSignalProcessing/Version4.05/">V4.05</a>,
<a href="http://www.cs.jhu.edu/~misha/Code/TextureSignalProcessing/Version4.03/">V4.03</a>,
<a href="http://www.cs.jhu.edu/~misha/Code/TextureSignalProcessing/Version4.02/">V4.02</a>,
<a href="http://www.cs.jhu.edu/~misha/Code/TextureSignalProcessing/Version4.01/">V4.01</a>,
<a href="http://www.cs.jhu.edu/~misha/Code/TextureSignalProcessing/Version4.00/">V4.00</a>,
<a href="http://www.cs.jhu.edu/~misha/Code/TextureSignalProcessing/Version3.00/">V3.00</a>,
<a href="http://www.cs.jhu.edu/~misha/Code/TextureSignalProcessing/Version2.00/">V2.00</a>,
<a href="http://www.cs.jhu.edu/~misha/Code/TextureSignalProcessing/Version1.00/">V1.00</a>
</ul>
<hr>
<a name="EXECUTABLES"><b>EXECUTABLES</b></a><br>
<ul>
These applications support reading in textured meshes in one of two formats:
<UL>
<LI><A href="http://www.cc.gatech.edu/projects/large_models/ply.html">PLY</A> (to support multiple charts, the texture is assumed to be encoded at wedges/corners rather than at vertices)
<LI><A HREF="https://www.fileformat.info/format/wavefrontobj/egff.htm">Wavefront OBJ</A>
</UL>
Input textures are assumed to be images in <I>png</I>, <I>jpg</I>, or <I>jpeg</I> format.
<dl>
<details>
<summary>
<font size="+1"><b>TextureFiltering</b></font>:
Supports the (localized) smoothing and sharpening of a texture by solving a screened Poisson equation which gives the signal whose values match the input and whose gradients match the modulated gradients of the input. If no output texture is specified, the executable will launch an interactive viewer that supports local "painting" of gradient modulation values and prescription of a global interpolation weight.<BR>
In the interactive viewer the modulation can be set globally by dragging the slider on the top left.<BR>
The modulation can be set locally by holding the [SHIFT] key down and either dragging with the left mouse button (to smooth) or the right mouse button (to sharpen).
</summary>
<dt><b>--in</b> <<i>input mesh and texture names</i>></dt>
<dd> These two strings specify the names of the mesh and the texture image.
</dd>
<dt>[<b>--out</b> <<i>output texture</i>>]</dt>
<dd> This string is the name of the file to which the processed texture will be written.
</dd>
<dt>[<b>--outVCycles</b> <<i>output v-cycles</i>>]</dt>
<dd> This integer specifies the number of v-cycles to use if the processed texture is output to a file and a direct solver is not used.<BR>
The default value for this parameter is 6.
</dd>
<dt>[<b>--interpolation</b> <<i>interpolation weight</i>>]</dt>
<dd> This floating-point value gives the interpolation weight.<BR>
The default value for this parameter is 1000.
</dd>
<dt>[<b>--modulation</b> <<i>gradient modulation</i>>]</dt>
<dd> This floating-point value gives the (uniform) gradient modulation.<BR>
The default value for this parameter is 1.
</dd>
<dt>[<b>--jitter</b> <<i>random seed</i>>]</dt>
<dd> If specified, this integer value is used to seed the random number generation for jittering. (This is used to avoid singular situations when mesh vertices fall directly on edges in the texture grid. In such a situation, the executable will issue a warning <B>"Zero row at index ..."</B>.)
</dd>
<dt>[<b>--useDirectSolver</b>]</dt>
<dd> If enabled, this flag specifies that a direct solver should be used (instead of the default multigrid solver).
</dd>
<dt>[<b>--sanityCheck</b>]</dt>
<dd> If enabled, this flag specifies that sanity-checks should be run to confirm that the texture-mapping is valid (e.g. that flipped triangles do not extend beyond the boundaries of the charts) and an exception is thrown if the mapping is found to be invalid.
</dd>
</details>
</dl>
</ul>
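The screened Poisson setup behind <B>TextureFiltering</B> can be illustrated on a plain 1D signal (a hypothetical sketch, not this library's API or its multigrid solver): minimizing &alpha;&middot;||f &minus; f<sub>0</sub>||&sup2; + ||&nabla;f &minus; &lambda;&middot;&nabla;f<sub>0</sub>||&sup2; yields the linear system (&alpha;I + L)f = &alpha;f<sub>0</sub> + &lambda;Lf<sub>0</sub>, where L is the graph Laplacian. A modulation &lambda; &lt; 1 smooths, &lambda; &gt; 1 sharpens, and &lambda; = 1 reproduces the input exactly.

```python
import numpy as np

def screened_poisson_1d(f0, alpha=1000.0, lam=1.0):
    """Solve (alpha*I + L) f = alpha*f0 + lam*L*f0 on a 1D chain."""
    n = len(f0)
    # Forward-difference operator D ((n-1) x n), so L = D^T D is the Laplacian.
    D = np.zeros((n - 1, n))
    for i in range(n - 1):
        D[i, i], D[i, i + 1] = -1.0, 1.0
    L = D.T @ D
    return np.linalg.solve(alpha * np.eye(n) + L, alpha * f0 + lam * (L @ f0))

# A noisy test signal (illustrative data only).
f0 = np.sin(np.linspace(0, np.pi, 64)) + 0.1 * np.random.RandomState(0).randn(64)
smoothed  = screened_poisson_1d(f0, alpha=100.0, lam=0.0)  # lam < 1: smooth
sharpened = screened_poisson_1d(f0, alpha=100.0, lam=2.0)  # lam > 1: sharpen
identity  = screened_poisson_1d(f0, alpha=100.0, lam=1.0)  # lam = 1: input
```

The executable solves the analogous system over the texture atlas, with the painting interface setting &lambda; per texel instead of uniformly.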
<ul>
<dl>
<details>
<summary>
<font size="+1"><b>TextureStitching</b></font>:
Supports the stitching together of multiple (possibly overlapping) textures by solving a screened Poisson equation with value constraints defined by the input texture.
The interactive viewer runs in two modes:
<OL>
<LI> A user specifies a single composite texture and (optionally) a mask file indicating when texels in the composite come from the same source.
In this case gradient constraints are obtained by copying gradients from the composite whenever the two texels defining an edge come from the same source, and setting the gradient constraint to zero along edges coming from different sources. If no mask file is provided, a default mask is created by assigning texels the same color if and only if they are covered by the same chart.<BR>
The viewer shows the stitched texture on the left and the composite texture on the right.
<LI> A user specifies multiple partial texture files and corresponding confidence masks.
In this case gradient constraints are obtained by blending gradients from the different inputs, weighted by confidence, and setting gradients to zero in regions where there are no textures with non-zero confidence.
The viewer shows the stitched texture on the left and a partial texture on the right. The user can selectively replace blended value/gradient constraints with the values/gradients from the partial texture by holding the [SHIFT] key down and dragging over the region to be in-painted.
</OL>
</summary>
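The gradient constraints of the first mode can be sketched on a flat 1D signal (a hedged illustration, not the atlas solver itself): gradients are copied from the composite wherever adjacent samples come from the same source, zeroed across the seam, and a lightly screened Poisson solve then removes the seam while preserving the texture detail.

```python
import numpy as np

# Composite of two "sources" with a unit jump at the seam, plus shared texture.
n = 64
seam = n // 2
x = np.linspace(0, 4 * np.pi, n)
f0 = 0.1 * np.sin(x) + (np.arange(n) >= seam)   # texture + step discontinuity
source = (np.arange(n) >= seam).astype(int)     # mask: source id of each sample

# Target gradients: copy within a source, zero across the source boundary.
g = np.diff(f0)
g[source[:-1] != source[1:]] = 0.0

# Solve  min_f  alpha*||f - f0||^2 + ||D f - g||^2,
# i.e.  (alpha*I + D^T D) f = alpha*f0 + D^T g.
D = np.zeros((n - 1, n))
for i in range(n - 1):
    D[i, i], D[i, i + 1] = -1.0, 1.0
alpha = 0.01
f = np.linalg.solve(alpha * np.eye(n) + D.T @ D, alpha * f0 + D.T @ g)
```

The small interpolation weight keeps the overall intensity of the composite while letting the zeroed seam gradient diffuse the jump away.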
<dt><b>--in</b> <<i>input mesh and composite texture</i>></dt>
<dd> These two strings specify the names of the mesh and the texture image.
</dd>
<dt>[<b>--mask</b> <<i>input mask</i>>]</dt>
<dd> This string specifies the name of the mask image.<br>
Black pixels in the mask file denote regions where the texel value is unknown. (Results may be unpredictable if the mask is encoded using lossy compression.)
</dd>
<dt>[<b>--out</b> <<i>output texture</i>>]</dt>
<dd> This string is the name of the file to which the stitched texture will be written.
</dd>
<dt>[<b>--outVCycles</b> <<i>output v-cycles</i>>]</dt>
<dd> This integer specifies the number of v-cycles to use if the stitched texture is output to a file and a direct solver is not used.<BR>
The default value for this parameter is 6.
</dd>
<dt>[<b>--interpolation</b> <<i>interpolation weight</i>>]</dt>
<dd> This floating-point value gives the interpolation weight.<BR>
The default value for this parameter is 100.
</dd>
<!--
<dt>[<b>--dilateBounaries</b> <<i>dilation radius</i>>]</dt>
<dd> This integer value gives the radius by which the boundaries of the segments should be dilated before stitching is performed.<BR>
The default value for this parameter is -1, indicating no dilation.
</dd>
-->
<dt>[<b>--jitter</b> <<i>random seed</i>>]</dt>
<dd> If specified, this integer value is used to seed the random number generation for jittering. (This is used to avoid singular situations when mesh vertices fall directly on edges in the texture grid. In such a situation, the executable will issue a warning <B>"Zero row at index ..."</B>.)
</dd>
</details>
</dl>
</ul>