Instant
WARNING: Instant is deprecated. Please use Menoh instead.
Instant is a DNN inference library written in C++.
Instant is released under the MIT License.
Goal
- DNN Inference with CPU
- C++
- ONNX support
- Easy to use (see the sketch after this list).
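To make the "easy to use" goal concrete, here is a minimal workflow sketch. The library calls shown in the comments (instant::load_onnx, model.run, the model path) are hypothetical placeholders, not Instant's documented API; see example/vgg16_example in the repository for real usage.

// Minimal workflow sketch. The instant::* names and the model path in the
// comments below are hypothetical placeholders, not the real Instant API.
#include <vector>

int main() {
    // Prepare an input tensor in NCHW layout; VGG16 expects 1 x 3 x 224 x 224.
    std::vector<float> input(1 * 3 * 224 * 224, 0.f);

    // Hypothetical calls: load an ONNX model, then run CPU inference
    // backed by MKL-DNN.
    // auto model = instant::load_onnx("VGG16.onnx");
    // auto output = model.run("input", input);

    return 0;
}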
Requirement
- MKL-DNN Library
- ProtocolBuffers
Build
Execute the commands below in the root directory.
sh retrieve_data.sh
mkdir build && cd build
cmake ..
make
Installation
Execute the command below in the build directory created in the Build section.
make install
Run VGG16 example
Execute the command below in the root directory.
./example/vgg16_example
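An ImageNet classifier like VGG16 produces a 1000-class softmax vector, and an example program of this kind typically finishes by printing the top-k classes. The self-contained sketch below shows only that final top-k step; the scores are dummy values, and it does not claim to reproduce the real example's output.

// Top-k selection over a softmax output vector. Scores are dummy values
// for illustration; the 1000 classes match VGG16's ImageNet head.
#include <algorithm>
#include <iostream>
#include <numeric>
#include <vector>

int main() {
    std::vector<float> softmax_out(1000, 0.001f);
    softmax_out[283] = 0.9f; // dummy peak so the output is non-trivial

    // Order class indices by descending score, sorting only the first k fully.
    std::vector<int> idx(softmax_out.size());
    std::iota(idx.begin(), idx.end(), 0);
    int const k = 5;
    std::partial_sort(idx.begin(), idx.begin() + k, idx.end(),
                      [&](int a, int b) { return softmax_out[a] > softmax_out[b]; });

    for (int i = 0; i < k; ++i) {
        std::cout << "class " << idx[i] << ": " << softmax_out[idx[i]] << "\n";
    }
    return 0;
}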
Currently supported nodes
- Conv (2D)
- Relu
- MaxPool
- Reshape (nchw -> nc)
- FC
- Dropout
- Softmax
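Since only the node types above are supported, it can be useful to check a model before trying to run it. The sketch below is an assumption-laden illustration, not part of Instant: it requires onnx.pb.h generated from ONNX's onnx.proto with protoc (ProtocolBuffers is already a requirement), and it checks op names only, not attribute constraints such as the nchw -> nc restriction on Reshape.

// Sketch: scan an ONNX model and report ops outside the supported set above.
// Assumes onnx.pb.h was generated from onnx.proto with protoc; checks op
// names only, not attribute-level restrictions.
#include <fstream>
#include <iostream>
#include <set>
#include <string>
#include "onnx.pb.h"

int main(int argc, char** argv) {
    if (argc != 2) {
        std::cerr << "usage: " << argv[0] << " model.onnx\n";
        return 1;
    }
    std::ifstream input(argv[1], std::ios::binary);
    onnx::ModelProto model;
    if (!model.ParseFromIstream(&input)) {
        std::cerr << "failed to parse " << argv[1] << "\n";
        return 1;
    }
    // Node types currently supported by Instant, copied from the list above.
    std::set<std::string> const supported{
        "Conv", "Relu", "MaxPool", "Reshape", "FC", "Dropout", "Softmax"};
    bool ok = true;
    for (auto const& node : model.graph().node()) {
        if (!supported.count(node.op_type())) {
            std::cout << "unsupported op: " << node.op_type() << "\n";
            ok = false;
        }
    }
    std::cout << (ok ? "all ops supported\n" : "model not fully supported\n");
    return ok ? 0 : 1;
}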
