
WARNING: Instant is deprecated. Please use Menoh instead.

Instant

Instant is a DNN inference library written in C++.

Instant is released under the MIT License.

Goal

  • DNN inference with CPU
  • C++
  • ONNX support
  • Easy to use

Requirement

  • MKL-DNN Library
  • ProtocolBuffers

Build

Execute the following commands in the repository root directory.

sh retrieve_data.sh
mkdir build && cd build
cmake ..
make

Installation

Execute the following command in the build directory created in the Build section.

make install

Run VGG16 example

Execute the following command in the root directory.

./example/vgg16_example

Current supported nodes

  • Conv (2D)
  • Relu
  • MaxPool
  • Reshape (nchw -> nc)
  • FC
  • Dropout
  • Softmax
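As a quick illustration of what two of these nodes compute, here is a minimal standalone C++ sketch of the Relu and Softmax semantics. This is plain reference code for the operator math, not Instant's internal API (function names here are illustrative only):

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

// Relu: element-wise max(x, 0).
std::vector<float> relu(std::vector<float> x) {
    for (float& v : x) v = std::max(v, 0.0f);
    return x;
}

// Softmax: exp(x_i - m) / sum_j exp(x_j - m), where m = max(x).
// Subtracting the max stabilizes the computation against overflow.
std::vector<float> softmax(std::vector<float> x) {
    const float m = *std::max_element(x.begin(), x.end());
    float sum = 0.0f;
    for (float& v : x) { v = std::exp(v - m); sum += v; }
    for (float& v : x) v /= sum;
    return x;
}
```

In a VGG16-style graph such as the bundled example, Relu follows each Conv and FC node, and Softmax turns the final FC output into class probabilities.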
