High-efficiency floating-point neural network inference operators for mobile, server, and Web
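This description is XNNPACK's tagline; the project exposes individual inference operators rather than a whole-graph runtime. Below is a minimal sketch of running one floating-point operator through that C API. It assumes an older XNNPACK release in which a single setup call bound shapes and data pointers (newer releases split this into separate reshape and setup calls), so treat the exact signatures as version-dependent.

```cpp
#include <xnnpack.h>   // XNNPACK operator API
#include <cmath>       // INFINITY
#include <cstddef>
#include <vector>

int main() {
  // Initialize the library with the default allocator.
  if (xnn_initialize(nullptr) != xnn_status_success) return 1;

  // Element-wise multiply of two 1x128 f32 tensors, no output clamping.
  xnn_operator_t mul_op = nullptr;
  xnn_create_multiply_nd_f32(-INFINITY, +INFINITY, /*flags=*/0, &mul_op);

  const size_t shape[2] = {1, 128};
  std::vector<float> a(128, 2.0f), b(128, 3.0f), out(128);

  // Assumption: older releases bound shapes and pointers in one setup call;
  // newer releases split this into xnn_reshape_* and xnn_setup_*.
  xnn_setup_multiply_nd_f32(mul_op, 2, shape, 2, shape,
                            a.data(), b.data(), out.data(),
                            /*threadpool=*/nullptr);
  xnn_run_operator(mul_op, /*threadpool=*/nullptr);  // each out[i] == 6.0f

  xnn_delete_operator(mul_op);
  xnn_deinitialize();
  return 0;
}
```

Passing a null threadpool runs the operator single-threaded; on mobile targets a pthreadpool instance is usually shared across operators instead.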
Embedded and mobile deep learning research resources
NCNN Framework: High-performance neural network inference for mobile, embedded, and edge AI deployment.
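To give a sense of the workflow, here is a minimal sketch of NCNN's load-then-extract API. The model file names, the 227x227 input size, the mean values, and the "data"/"prob" blob names are assumptions borrowed from the classic SqueezeNet example; they differ per model.

```cpp
#include "net.h"   // ncnn
#include <vector>

// bgr: packed BGR pixel buffer of size img_w * img_h * 3 (assumed provided).
std::vector<float> classify(const unsigned char* bgr, int img_w, int img_h) {
  ncnn::Net net;
  // Placeholder file names for a converted ncnn model.
  if (net.load_param("squeezenet.param") != 0) return {};
  if (net.load_model("squeezenet.bin") != 0) return {};

  // Resize to the network's expected input and subtract the channel means.
  ncnn::Mat in = ncnn::Mat::from_pixels_resize(
      bgr, ncnn::Mat::PIXEL_BGR, img_w, img_h, 227, 227);
  const float mean_vals[3] = {104.f, 117.f, 123.f};
  in.substract_mean_normalize(mean_vals, nullptr);  // ncnn's actual spelling

  ncnn::Extractor ex = net.create_extractor();
  ex.input("data", in);          // blob names depend on the model
  ncnn::Mat out;
  ex.extract("prob", out);

  std::vector<float> scores(out.w);
  for (int i = 0; i < out.w; i++) scores[i] = out[i];
  return scores;
}
```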
Mobile AI: on-device inference and ML deployment for smartphones, covering iOS CoreML, Android TFLite, ONNX, and TensorRT.
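Of the runtimes listed, TFLite has the most uniform cross-platform C++ API, so a minimal sketch of its interpreter flow follows. The model path is a placeholder and float32 I/O is assumed; a real app would fill the input tensor with a preprocessed camera frame.

```cpp
#include <memory>
#include "tensorflow/lite/interpreter.h"
#include "tensorflow/lite/kernels/register.h"
#include "tensorflow/lite/model.h"

int main() {
  // "mobilenet_v2.tflite" is a placeholder; any float32 model works.
  std::unique_ptr<tflite::FlatBufferModel> model =
      tflite::FlatBufferModel::BuildFromFile("mobilenet_v2.tflite");
  if (!model) return 1;

  tflite::ops::builtin::BuiltinOpResolver resolver;
  std::unique_ptr<tflite::Interpreter> interpreter;
  tflite::InterpreterBuilder(*model, resolver)(&interpreter);
  if (!interpreter) return 1;

  interpreter->SetNumThreads(4);   // tune per device
  interpreter->AllocateTensors();

  float* input = interpreter->typed_input_tensor<float>(0);
  // ... fill `input` with a preprocessed image here ...

  interpreter->Invoke();
  const float* output = interpreter->typed_output_tensor<float>(0);
  (void)output;  // read class scores from `output`
  return 0;
}
```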
Open-source mobile deep learning model for COVID-19 detection from point-of-care ultrasound (POCUS)