# algovivo

<p>
  <a href="https://github.com/juniorrojas/algovivo/actions/workflows/test.yml">
    <img src="https://github.com/juniorrojas/algovivo/actions/workflows/test.yml/badge.svg" alt="Test">
  </a>
  <a href="https://deepwiki.com/juniorrojas/algovivo">
    <img src="https://deepwiki.com/badge.svg" alt="Ask DeepWiki">
  </a>
</p>

An energy-based formulation for soft-bodied virtual creatures.
[interactive demo](https://juniorrojas.com/algovivo)

<a href="https://juniorrojas.com/algovivo">
  <img src="media/locomotion.gif" width="250px">
</a>

Instead of implementing simulations with explicit position update rules and manually derived force functions, we can implement them as gradient-based optimization of differentiable energy functions, computing forces and other derivatives with automatic differentiation. Automatic differentiation provides the gradients needed for both potential energy minimization and numerical integration.
This repository implements six energy functions: neo-Hookean triangles, controllable muscles, gravity, terrain collision, friction, and inertia (for backward Euler integration). The energy functions are implemented in C++ (with some parts automatically generated from Python) and differentiated with Enzyme. Additional functionality, including the optimization loop, is implemented in C++, compiled to WebAssembly, and wrapped as a JavaScript library.
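The idea of numerical integration as energy minimization can be illustrated with a minimal sketch in plain JavaScript: backward Euler for a 1D mass on a spring under gravity, where each step minimizes an incremental potential. All names and constants here are illustrative, the gradient is written by hand (algovivo computes it with Enzyme), and a plain gradient descent loop stands in for algovivo's actual optimizer.

```javascript
// Backward Euler as energy minimization, in 1D.
// Incremental potential: E(x) = m/(2h^2) * (x - xHat)^2 + U(x),
// where xHat = x0 + h*v0 is the inertial prediction.
// Illustrative sketch only; algovivo implements this in C++/WASM.

const m = 1.0, g = 9.8, k = 50.0, h = 1 / 30;

// potential energy: spring anchored at the origin, plus gravity
function U(x) { return 0.5 * k * x * x + m * g * x; }
function dU(x) { return k * x + m * g; }  // hand-written gradient

function step(x0, v0) {
  const xHat = x0 + h * v0;
  let x = xHat;
  // minimize E(x) by gradient descent (a stand-in for a real optimizer)
  for (let i = 0; i < 200; i++) {
    const grad = (m / (h * h)) * (x - xHat) + dU(x);
    x -= 1e-4 * grad;
  }
  const v = (x - x0) / h;  // velocity implied by the implicit update
  return [x, v];
}

let x = 1.0, v = 0.0;
for (let i = 0; i < 300; i++) [x, v] = step(x, v);
// backward Euler is dissipative, so the mass settles near
// the static equilibrium x* = -m*g/k
console.log(x.toFixed(3));  // ≈ -0.196
```

Forces never appear explicitly: they are the (negative) gradient of the energy, which is exactly what automatic differentiation produces in the real implementation.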
## quick start

The following HTML page creates a simple simulation with one triangle and two muscles, one of which is driven by a periodic signal.
<img src="media/periodic.gif" width="250px">

```html
<!DOCTYPE html>
<html>
<head>
  <meta charset="UTF-8">
</head>
<body>
<script type="module">
import algovivo from "https://cdn.jsdelivr.net/gh/juniorrojas/algovivo@ae28f9c/build/algovivo.min.mjs";

async function loadWasm() {
  const response = await fetch("https://cdn.jsdelivr.net/gh/juniorrojas/algovivo@ae28f9c/build/algovivo.wasm");
  const wasm = await WebAssembly.instantiateStreaming(response);
  return wasm.instance;
}

async function main() {
  const system = new algovivo.System({
    wasmInstance: await loadWasm()
  });

  system.set({
    pos: [
      [0, 0],
      [2, 0],
      [1, 1]
    ],
    triangles: [
      [0, 1, 2]
    ],
    muscles: [
      [0, 2],
      [1, 2]
    ]
  });

  const viewport = new algovivo.SystemViewport({ system });
  document.body.appendChild(viewport.domElement);
  viewport.render();

  let t = 0;
  setInterval(() => {
    system.a.set([
      1,
      0.2 + 0.8 * (Math.cos(t * 0.1) * 0.5 + 0.5)
    ]);
    t++;
    system.step();
    viewport.render();
  }, 1000 / 30);
}

main();
</script>
</body>
</html>
```
The code above imports the ES module algovivo.min.mjs and loads the compiled WebAssembly module algovivo.wasm from jsDelivr. To serve these files from your own server, you can download them from the build branch.
## muscle commands
Muscle commands can be specified with system.a.set([...]). The array length must match the number of muscles. A value of 1 means that the muscle is relaxed and wants to keep its original rest length. Values less than 1 indicate that the muscle wants to contract to some fraction of its original rest length.
| system.a.set([0.3, 1]) | system.a.set([1, 0.3]) | system.a.set([0.3, 0.3]) |
| ------------- |-------------| -----|
| <div align="center"><img src="media/muscle-contract-left.png" width="140px"></div> | <div align="center"><img src="media/muscle-contract-right.png" width="140px"></div> | <div align="center"><img src="media/muscle-contract-both.png" width="140px"></div> |
This is achieved using an action-dependent potential energy function for each muscle.
$$ E(x, a) = \frac{k}{2} \left(\frac{l(x)}{a\ l_0} - 1\right)^2 $$
More details about this and other energy functions used in the simulation can be found here.
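As a quick numeric check, the muscle energy above can be evaluated directly for a single 2D muscle. This is a standalone sketch: the function name and the stiffness value are illustrative, not part of the algovivo API.

```javascript
// E(x, a) = k/2 * (l(x) / (a * l0) - 1)^2
// for one 2D muscle between vertices p and q.
// Names and the default stiffness are illustrative only.
function muscleEnergy(p, q, a, l0, k = 1.0) {
  const dx = q[0] - p[0];
  const dy = q[1] - p[1];
  const l = Math.hypot(dx, dy);  // current muscle length l(x)
  const r = l / (a * l0) - 1;    // deviation from the actuated rest length
  return 0.5 * k * r * r;
}

// a = 1: a muscle at its rest length stores no energy
console.log(muscleEnergy([0, 0], [1, 0], 1.0, 1.0));  // 0
// a = 0.5: the same geometry is now stretched relative to the
// contracted rest length a*l0, so the energy is positive
console.log(muscleEnergy([0, 0], [1, 0], 0.5, 1.0));  // 0.5
```

Note how contraction is modeled purely by rescaling the rest length: the force pulling the endpoints together is just the negative gradient of this energy with respect to the positions.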
## neural controller

Instead of manually scripting muscle commands, a neural controller can map proprioceptive signals to muscle control signals to produce locomotion. The example below loads a mesh and a pretrained controller included in this repository. The controller used here is an MLP that takes as input vertex positions and velocities projected onto a local frame, as shown here.
<img src="media/locomotion.gif" width="250px">

```html
<!DOCTYPE html>
<html>
<head>
  <meta charset="UTF-8">
</head>
<body>
<script type="module">
import algovivo from "https://cdn.jsdelivr.net/gh/juniorrojas/algovivo@ae28f9c/build/algovivo.min.mjs";

async function loadWasm() {
  const response = await fetch("https://cdn.jsdelivr.net/gh/juniorrojas/algovivo@ae28f9c/build/algovivo.wasm");
  const wasm = await WebAssembly.instantiateStreaming(response);
  return wasm.instance;
}

async function main() {
  const meshData = await (await fetch("https://cdn.jsdelivr.net/gh/juniorrojas/algovivo@a5e8c73/demo/public/data/biped/mesh.json")).json();
  const policyData = await (await fetch("https://cdn.jsdelivr.net/gh/juniorrojas/algovivo@a5e8c73/demo/public/data/biped/policy.json")).json();

  const system = new algovivo.System({
    wasmInstance: await loadWasm()
  });
  system.set(meshData);

  const policy = new algovivo.nn.MLPPolicy({ system, active: true });
  policy.loadData(policyData);

  const viewport = new algovivo.SystemViewport({
    system,
    sortedVertexIds: meshData.sorted_vertex_ids,
    vertexDepths: meshData.depth
  });
  document.body.appendChild(viewport.domElement);
  viewport.render();

  setInterval(() => {
    policy.step();
    system.step();
    viewport.render();
  }, 1000 / 30);
}

main();
</script>
</body>
</html>
```
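The structure of such a controller can be illustrated with a minimal MLP forward pass mapping an observation vector to muscle commands. This is a hedged sketch: the layer sizes, tanh activation, output squashing, and every name below are assumptions for illustration, not the actual format of policy.json or the MLPPolicy internals.

```javascript
// Minimal MLP forward pass: observation -> hidden -> muscle commands.
// All shapes, activations, and names are illustrative assumptions;
// algovivo's MLPPolicy loads its real weights from policy.json.
function linear(W, b, x) {
  return W.map((row, i) =>
    row.reduce((s, wij, j) => s + wij * x[j], b[i]));
}

function mlpPolicy(params, obs) {
  const h = linear(params.W1, params.b1, obs).map(Math.tanh);
  const y = linear(params.W2, params.b2, h);
  // squash outputs into (0.2, 1.0): 1 = relaxed, smaller = contract
  return y.map(v => 0.2 + 0.8 * (1 / (1 + Math.exp(-v))));
}

// tiny example: 2 observations -> 3 hidden units -> 2 muscles
const params = {
  W1: [[0.1, -0.2], [0.3, 0.0], [-0.1, 0.2]], b1: [0, 0, 0],
  W2: [[0.5, -0.5, 0.1], [0.0, 0.2, -0.3]],   b2: [0, 0]
};
const a = mlpPolicy(params, [1.0, -1.0]);
console.log(a.length);  // 2, e.g. suitable for system.a.set(a)
```

In the real demo the observation is built from vertex positions and velocities projected onto a local frame, and the policy output is applied each frame via `policy.step()` before `system.step()`.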
## build from source

### build JS

```sh
npm ci
npm run build
```
### build WASM

```sh
python codegen/codegen_csrc.py && \
docker run \
  --user $(id -u):$(id -g) \
  -v $(pwd):/workspace \
  -w /workspace \
  ghcr.io/juniorrojas/algovivo/llvm18-enzyme:latest \
  ./build.sh
```
## citation

```bibtex
@proceedings{10.1162/isal_a_00748,
  author = {Rojas, Junior},
  title = "{Energy-Based Models for Virtual Creatures}",
  volume = {ALIFE 2024: Proceedings of the 2024 Artificial Life Conference},
  series = {Artificial Life Conference Proceedings},
  pages = {30},
  year = {2024},
  month = {07},
  doi = {10.1162/isal_a_00748},
  url = {https://doi.org/10.1162/isal\_a\_00748},
  eprint = {https://direct.mit.edu/isal/proceedings-pdf/isal2024/36/30/2461221/isal\_a\_00748.pdf},
}
```