opencv/doc/tutorials/dnn/dnn_openvino/dnn_openvino.markdown
Myron Rodrigues 344f8c6400
Merge pull request #27363 from MRo47:openvino-npu-support
Feature: Add OpenVINO NPU support #27363

## Why
- OpenVINO now supports inference on the integrated NPU in Intel's Core Ultra series processors.
- NPU inference can be as fast as the GPU in some cases, while drawing considerably less power.

## How
- The NPU plugin is now reported as "NPU" by OpenVINO's `ov::Core::get_available_devices()`.
- Removed the guards and checks that excluded NPU from the available targets of the Inference Engine backend.

## Test example

### Pre-requisites
- Intel [Core Ultra series processor](https://www.intel.com/content/www/us/en/products/details/processors/core-ultra/edge.html#tab-blade-1-0)
- [Intel NPU driver](https://github.com/intel/linux-npu-driver/releases)
- OpenVINO 2023.3.0+ (Tested on 2025.1.0)

### Example
```cpp
#include <opencv2/dnn.hpp>
#include <iostream>

int main(){
    cv::dnn::Net net = cv::dnn::readNet("../yolov8s-openvino/yolov8s.xml", "../yolov8s-openvino/yolov8s.bin");
    cv::Size net_input_shape = cv::Size(640, 480);
    std::cout << "Setting backend to DNN_BACKEND_INFERENCE_ENGINE and target to DNN_TARGET_NPU" << std::endl;
    net.setPreferableBackend(cv::dnn::DNN_BACKEND_INFERENCE_ENGINE);
    net.setPreferableTarget(cv::dnn::DNN_TARGET_NPU);

    cv::Mat image(net_input_shape, CV_8UC3);
    cv::randu(image, cv::Scalar(0, 0, 0), cv::Scalar(255, 255, 255));
    cv::Mat blob = cv::dnn::blobFromImage(
        image, 1, net_input_shape, cv::Scalar(0, 0, 0), true, false, CV_32F);
    net.setInput(blob);
    std::cout << "Running forward" << std::endl;
    cv::Mat result = net.forward();
    std::cout << "Output shape: " << result.size << std::endl; // Output shape: 1 x 84 x 6300
}
```

Model files are available [here](https://limewire.com/d/bPgiA#BhUeSTBnMc).

Docker image used to build OpenCV: [ghcr.io/mro47/opencv-builder](https://github.com/MRo47/opencv-builder/blob/main/Dockerfile)

Closes #26240

### Pull Request Readiness Checklist

See details at https://github.com/opencv/opencv/wiki/How_to_contribute#making-a-good-pull-request

- [x] I agree to contribute to the project under Apache 2 License.
- [x] To the best of my knowledge, the proposed patch is not based on a code under GPL or another license that is incompatible with OpenCV
- [x] The PR is proposed to the proper branch
- [x] There is a reference to the original bug report and related work
- [ ] There is accuracy test, performance test and test data in opencv_extra repository, if applicable
      Patch to opencv_extra has the same branch name.
- [ ] The feature is well documented and sample code can be built with the project CMake
2025-05-27 14:13:49 +03:00


# OpenCV usage with OpenVINO

@prev_tutorial{tutorial_dnn_halide_scheduling}
@next_tutorial{tutorial_dnn_yolo}

|    |    |
| -: | :- |
| Original author | Aleksandr Voron |
| Compatibility | OpenCV == 4.x |

This tutorial provides OpenCV installation guidelines and explains how to use OpenCV with OpenVINO.

Since the 2021.1.1 release, OpenVINO does not provide pre-built OpenCV. This change does not affect you if you use the OpenVINO runtime directly or the OpenVINO samples, as they have no strong dependency on OpenCV. However, if you use Open Model Zoo demos or the OpenVINO runtime as an OpenCV DNN backend, you need to get an OpenCV build.

There are two ways to get OpenCV:

- Install pre-built OpenCV from other sources: system repositories, pip, conda, Homebrew. A generic pre-built OpenCV package may have several limitations:
  - the OpenCV version may be out of date;
  - OpenCV may not contain the G-API module with OpenVINO support enabled (e.g. some OMZ demos use G-API functionality);
  - OpenCV may not be optimized for modern hardware (default builds have to cover a wide range of hardware);
  - OpenCV may not support Intel TBB or Intel Media SDK;
  - the OpenCV DNN module may not use OpenVINO as an inference backend.
- Build OpenCV from source against a specific version of OpenVINO. This approach solves the limitations mentioned above.

Instructions for both approaches are provided in the OpenCV wiki.

## Supported targets

The OpenVINO backend (`DNN_BACKEND_INFERENCE_ENGINE`) supports the following targets:

- `DNN_TARGET_CPU`: runs on the CPU; no additional dependencies required.
- `DNN_TARGET_OPENCL`, `DNN_TARGET_OPENCL_FP16`: run on the iGPU; require OpenCL drivers (install `intel-opencl-icd` on Ubuntu).
- `DNN_TARGET_MYRIAD`: runs on Intel® VPUs such as the Neural Compute Stick; requires device setup.
- `DNN_TARGET_HDDL`: runs on the Intel® Movidius™ Myriad™ X High Density Deep Learning VPU.
- `DNN_TARGET_FPGA`: runs on Intel® Altera® series FPGAs.
- `DNN_TARGET_NPU`: runs on the integrated Intel® AI Boost processor; requires the Linux or Windows NPU drivers.