meta-intel/dynamic-layers/openembedded-layer/recipes-support/opencv/files

commit 096598691d
Author: Chin Huat Ang <chin.huat.ang@intel.com>
Date:   2019-09-28 17:18:30 +08:00

dldt-inference-engine: add recipe

This recipe builds the inference engine from the opencv/dldt 2019 R1.1
release.
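
For reference, a minimal sketch of what the core of such a recipe might
look like (the revision, branch, and checksum values below are
placeholders, not the ones actually used; only the project name and
homepage come from this commit):

    SUMMARY = "Deep Learning Deployment Toolkit inference engine"
    HOMEPAGE = "https://01.org/openvinotoolkit"
    LICENSE = "Apache-2.0"
    # Placeholder checksum; the real recipe points at the actual license file.
    LIC_FILES_CHKSUM = "file://LICENSE;md5=xxxx"

    # Placeholder SRCREV standing in for the 2019 R1.1 release tag.
    SRCREV = "xxxxxxxxxxxxxxxx"
    SRC_URI = "git://github.com/opencv/dldt.git;protocol=https;branch=2019"

    # Assuming the CMake project lives in the inference-engine subdirectory.
    S = "${WORKDIR}/git/inference-engine"
    inherit cmake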

OpenVINO™ toolkit, short for Open Visual Inference and Neural network
Optimization toolkit, provides developers with improved neural network
performance on a variety of Intel® processors and helps further unlock
cost-effective, real-time vision applications.

The toolkit enables deep learning inference and easy heterogeneous
execution across multiple Intel® platforms (CPU, Intel® Processor
Graphics), providing implementations that scale from cloud
architectures to edge devices.

For more details, see:
https://01.org/openvinotoolkit

The recipe needs components from meta-oe, so it is placed under
dynamic-layers/openembedded-layer. GPU plugin support needs
intel-compute-runtime, which can be built by adding the clang layer
(meta-clang) to the mix as well.
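
As a sketch, the layer assembly could look like this in
conf/bblayers.conf (the paths are illustrative):

    # Layers providing dldt-inference-engine and its dependencies.
    BBLAYERS += " \
        /path/to/meta-openembedded/meta-oe \
        /path/to/meta-clang \
        /path/to/meta-intel \
    "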

CPU and GPU plugins have been sanity-tested to work using
classification_sample. Further fine-tuning is still needed to improve
performance.
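
For example, a sanity run on target could look like the following (the
model and image paths are hypothetical; -d selects the plugin):

    # Hypothetical invocation of the installed sample app.
    classification_sample -m squeezenet1.1.xml -i car.png -d CPU
    classification_sample -m squeezenet1.1.xml -i car.png -d GPU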

Original patch by Anuj Mittal.

Signed-off-by: Chin Huat Ang <chin.huat.ang@intel.com>
Signed-off-by: Anuj Mittal <anuj.mittal@intel.com>
Patches shipped in this directory:

    0001-disable-tests.patch
    0001-disable-werror.patch
    0001-fix-openmp-checking.patch
    0001-Supply-firmware-at-build-time.patch
    0001-use-provided-paths.patch
    0002-use-ade-and-pugixml-from-system.patch
    0007-Install-sample-apps-and-format_reader-library.patch
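
In the recipe these patches would typically be applied via SRC_URI,
along the lines of (a sketch, not the verbatim recipe):

    SRC_URI += " \
        file://0001-disable-tests.patch \
        file://0001-disable-werror.patch \
        file://0001-fix-openmp-checking.patch \
        file://0001-Supply-firmware-at-build-time.patch \
        file://0001-use-provided-paths.patch \
        file://0002-use-ade-and-pugixml-from-system.patch \
        file://0007-Install-sample-apps-and-format_reader-library.patch \
    "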