ncnn is a high-performance neural network inference framework optimized for the mobile platform
C++ example applications for deploying deep learning models using frameworks like PyTorch, TensorFlow, ONNX, and NCNN.
Set up an environment to experiment with ML libraries and NCNN in C++
MACE is a deep learning inference framework optimized for mobile heterogeneous computing platforms.
NCNN models for chaiNNer (Efenstor's mixes)
Multiprocess batch processor for G'MIC and NCNN-Vulkan CLI tools
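For context on what deploying a model with ncnn looks like in C++, the typical inference flow is sketched below. This is a minimal sketch only: the model file names and the "data"/"prob" blob names are placeholder assumptions and depend on the actual converted model.

```cpp
#include "net.h"  // ncnn's main header

int main()
{
    ncnn::Net net;

    // Hypothetical model files; ncnn models ship as a .param/.bin pair.
    if (net.load_param("model.param") != 0)
        return -1;
    if (net.load_model("model.bin") != 0)
        return -1;

    // Dummy 224x224 3-channel input filled with zeros, just to show the data flow.
    ncnn::Mat in(224, 224, 3);
    in.fill(0.f);

    ncnn::Extractor ex = net.create_extractor();
    ex.input("data", in);     // "data" is an assumed input blob name
    ncnn::Mat out;
    ex.extract("prob", out);  // "prob" is an assumed output blob name

    // For a classification model, out now holds the per-class scores.
    return 0;
}
```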