
Onnx2code

Convert ONNX models to plain C++ code (without dependencies)

Install / Use

/learn @mlomb/Onnx2code

README

onnx2code

Generate plain C++ code for inference of ONNX models without dependencies

This project was made as an alternative to a final exam for the course "Computer Organization II". You can read the write-up in docs/TP Final onnx2code.pdf (in Spanish).

Model support

The following models have been tested and work as expected.

| Model | Size |
|---|---|
| mnist | 26 KB |
| Super_Resolution | 240 KB |
| squeezenet1.1 | 9 MB |
| emotion_ferplus | 34 MB |
| inception-v2 | 44 MB |
| resnet50-caffe2-v1 | 98 MB |
| VGG 16 and VGG 16-bn | 527 MB |
| VGG 19 and VGG 19-bn | 548 MB |
| VGG 19-caffe2 | 561 MB |

  • Minimum ONNX opset version: 7
  • Quantized models are not supported
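To check the opset requirement before converting a model, you can inspect the `opset_import` field of a loaded ONNX model. A minimal sketch, assuming the `ModelProto` layout from the `onnx` package; `model_opset` is an illustrative helper, not part of onnx2code:

```python
def model_opset(model):
    """Return the default-domain (ai.onnx) opset version declared by a loaded
    ONNX ModelProto, or None if the model declares no default-domain import."""
    for imp in model.opset_import:
        # The empty string denotes the default ai.onnx operator domain.
        if imp.domain in ("", "ai.onnx"):
            return imp.version
    return None

# Usage (requires the `onnx` package):
#   import onnx
#   model = onnx.load("mnist.onnx")
#   assert model_opset(model) >= 7, "onnx2code requires opset >= 7"
```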

Operator support

Only float data type is supported.

| Operator | Attribute support |
|---|---|
| Add, Div, Mul, Sub | ✅ with broadcasting |
| Concat | ✅ with multiple inputs<br/>✅ axis |
| Conv | ✅ bias<br/>✅ stride<br/>✅ padding (and auto_pad)<br/>❌ dilations<br/>❌ depthwise (group != 1) |
| Sum | ✅ with multiple inputs<br/>❌ with broadcasting |
| Relu, Tanh, Sigmoid, Clip | ✅ |
| Gemm | ✅ with bias<br/>❌ transpose A<br/>✅ transpose B<br/>❌ alpha != 1<br/>❌ beta != 1 |
| Identity | ✅ |
| MaxPool, AveragePool | ✅ stride<br/>✅ padding (and auto_pad)<br/>❌ dilations<br/>❌ storage_order != 0<br/>❌ count_include_pad != 0 |
| Softmax | ✅ axis |
| Transpose | ✅ perm |
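Before running the converter, you can scan a model's graph for operator types outside this list. A small sketch; the supported-operator set is transcribed from the table above, and `unsupported_ops` is a hypothetical helper rather than part of the onnx2code API:

```python
# Operator types listed as supported in the table above.
SUPPORTED_OPS = {
    "Add", "Div", "Mul", "Sub", "Concat", "Conv", "Sum",
    "Relu", "Tanh", "Sigmoid", "Clip", "Gemm", "Identity",
    "MaxPool", "AveragePool", "Softmax", "Transpose",
}

def unsupported_ops(model):
    """Return the sorted list of op types appearing in the model's graph
    that are not in SUPPORTED_OPS."""
    return sorted({node.op_type for node in model.graph.node} - SUPPORTED_OPS)

# Usage (requires the `onnx` package):
#   import onnx
#   missing = unsupported_ops(onnx.load("mnist.onnx"))
#   if missing:
#       print("Unsupported operators:", missing)
```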

Setting up with Docker

We provide a ready-to-use Docker image:

docker run --rm -it -v $(pwd)/mnist.onnx:/app/input.onnx:ro -v $(pwd)/output:/app/output:rw mlomb/onnx2code:latest --variations=im2col,loop-tiling --checks=3

The command above will generate C++ code for the mnist.onnx model in the output folder.

Setting up locally

Prerequisites

  • gcc (required if checking models)
  • Python 3.10
  • pipenv

Clone the repository and install the dependencies with `pipenv install`.

Run

To generate code from an ONNX model, run the following command inside a pipenv shell:

python -m onnx2code --variations=im2col,loop-tiling mnist.onnx output_folder --checks=3
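If you want to drive the generator from another Python script, one option is to shell out to the CLI. A sketch that assumes only the command line documented in this README; the helper names are illustrative:

```python
import subprocess
import sys

def onnx2code_cmd(model_path, out_dir, variations=("im2col", "loop-tiling"), checks=3):
    """Build the argument list for the onnx2code CLI documented above."""
    return [
        sys.executable, "-m", "onnx2code",
        "--variations=" + ",".join(variations),
        model_path, out_dir,
        "--checks=" + str(checks),
    ]

def run_onnx2code(model_path, out_dir, **kwargs):
    """Invoke the CLI; raises subprocess.CalledProcessError on failure."""
    subprocess.run(onnx2code_cmd(model_path, out_dir, **kwargs), check=True)
```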

Security Score

77/100. Audited on Dec 16, 2025, with no findings.