Commit Graph

68 Commits

Author SHA1 Message Date
Alexander Alekhin
66d7956e67 Merge remote-tracking branch 'upstream/3.4' into merge-3.4 2019-06-15 16:25:11 +00:00
Dmitry Kurtaev
eba696a41e Merge pull request #14792 from dkurt:dnn_ie_min_version_r5
* Remove Inference Engine 2018R3 and 2018R4

* Fix 2018R5
2019-06-14 18:17:02 +03:00
Alexander Alekhin
f3de2b4be7 Merge remote-tracking branch 'upstream/3.4' into merge-3.4 2019-06-05 19:11:52 +03:00
Alexander Alekhin
f355b3505f Merge pull request #14661 from dkurt:ie_deconv_adj 2019-06-03 16:58:17 +00:00
Dmitry Kurtaev
9c0af1f675 Enable more deconvolution layer configurations with IE backend 2019-06-03 08:15:52 +03:00
Alexander Alekhin
43467a2ac7 Merge remote-tracking branch 'upstream/3.4' into merge-3.4 2019-05-28 18:29:48 +00:00
Alexander Alekhin
38a3c1ce6b dnn(test): update test tags for Debug build 2019-05-27 20:12:30 +00:00
Dmitry Kurtaev
44d21e5a79 Enable Slice layer on Inference Engine backend 2019-05-27 16:28:01 +03:00
Alexander Alekhin
e21262deba Merge remote-tracking branch 'upstream/3.4' into merge-3.4 2019-05-16 04:40:39 +00:00
Dmitry Kurtaev
6389dfe49c Fixed DetectionOutput output blob shape 2019-05-14 15:09:32 +03:00
Alexander Alekhin
e28e3c9491 Merge remote-tracking branch 'upstream/3.4' into merge-3.4 2019-05-01 08:27:45 +00:00
Dmitry Kurtaev
4f6be11c0e Check if Inference Engine networks are fully supported by backend 2019-04-25 11:27:17 +03:00
Alexander Alekhin
4635356435 Merge remote-tracking branch 'upstream/3.4' into merge-3.4 2019-04-13 20:00:54 +00:00
Alexander Alekhin
64629cb94e Merge pull request #12783 from alalek:test_tag 2019-04-12 15:35:41 +00:00
Dmitry Kurtaev
a2bbfa1db5 Enable some tests for Inference Engine 2019R1 2019-04-12 15:21:42 +03:00
Alexander Alekhin
e0841f3d6e dnn(test-tags): add time / memory tags 2019-04-08 19:18:25 +00:00
Alexander Alekhin
4001346a30 Merge remote-tracking branch 'upstream/3.4' into merge-3.4 2019-04-03 19:33:52 +00:00
Alexander Alekhin
8483801eab dnn: use OpenVINO 2019R1 defines 2019-04-03 15:39:47 +03:00
Alexander Alekhin
7442100caa Merge remote-tracking branch 'upstream/3.4' into merge-3.4 2019-03-29 19:29:36 +00:00
Lubov Batanina
7d3d6bc4e2 Merge pull request #13932 from l-bat:MyriadX_master_dldt
* Fix precision in tests for MyriadX

* Fix ONNX tests

* Add output range in ONNX tests

* Skip tests on Myriad OpenVINO 2018R5

* Add MyriadX detection

* Add MyriadX detection on OpenVINO R5

* Skip tests on Myriad for the next version of OpenVINO

* dnn(ie): VPU type from environment variable

* dnn(test): validate VPU type

* dnn(test): update DLIE test skip conditions
2019-03-29 16:42:58 +03:00
Alexander Alekhin
8c0b0714e7 Merge remote-tracking branch 'upstream/3.4' into merge-3.4 2019-03-11 19:20:22 +00:00
Alexander Nesterov
74574dfae4 Added fusion optimization 2019-03-05 18:12:03 -01:00
Alexander Alekhin
c3cf35ab63 Merge remote-tracking branch 'upstream/3.4' into merge-3.4 2019-02-26 17:34:42 +03:00
Dmitry Kurtaev
ed710eaa1c Make Inference Engine R3 the minimal supported version 2019-02-21 09:32:26 +03:00
Alexander Alekhin
f414c16c13 Merge remote-tracking branch 'upstream/3.4' into merge-3.4 2019-02-08 17:18:56 +00:00
Alexander Nesterov
9cbdb48d6d Fix change step 2019-02-07 11:14:20 -01:00
Alexander Alekhin
631b246881 Merge remote-tracking branch 'upstream/3.4' into merge-3.4 2019-01-22 18:00:34 +00:00
Dmitry Kurtaev
f0ddf302b2 Move Inference Engine to new API 2019-01-17 14:28:48 +03:00
Alexander Alekhin
1dee705074 Merge branch '3.4' into merge-3.4 2018-12-22 05:40:15 +00:00
Dmitry Kurtaev
59ce1d80a5 Fix dnn tests for Inference Engine R5 2018-12-21 12:33:30 +03:00
Alexander Alekhin
9ff1c39daa dnn: fixup available backends/targets 2018-12-05 19:19:17 +03:00
Maksim Shabunin
fe459c82e5 Merge pull request #13332 from mshabunin:dnn-backends
DNN backends registry (#13332)

* Added a dnn backends registry (see the query sketch after this entry)

* dnn: process DLIE/FPGA target
2018-12-05 18:11:45 +03:00
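A minimal sketch of how the backends registry can be queried from user code, assuming an OpenCV build whose public dnn API exposes getAvailableBackends()/getAvailableTargets() (present in builds from around this time onward):

    // List the (backend, target) pairs registered in this build of OpenCV.
    #include <iostream>
    #include <opencv2/dnn.hpp>

    int main()
    {
        for (const auto& bt : cv::dnn::getAvailableBackends())
            std::cout << "backend=" << bt.first << " target=" << bt.second << "\n";

        // Targets registered for one specific backend, e.g. the default one.
        for (auto t : cv::dnn::getAvailableTargets(cv::dnn::DNN_BACKEND_OPENCV))
            std::cout << "DNN_BACKEND_OPENCV target: " << t << "\n";
        return 0;
    }
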
Alexander Alekhin
8f4e5c2fb8 Merge remote-tracking branch 'upstream/3.4' into merge-3.4 2018-11-26 15:37:45 +03:00
Dmitry Kurtaev
84ce2cc211 Enable some dnn tests for the new Intel Inference Engine release (R4) 2018-11-26 13:02:24 +03:00
Dmitry Kurtaev
2f6f52d644 Fix ONNX's emotion_ferplus model.
Reduce input size for OpenPose tests
2018-11-23 19:00:17 +03:00
Alexander Alekhin
22dbcf98c5 Merge remote-tracking branch 'upstream/3.4' into merge-3.4 2018-11-17 14:17:35 +00:00
Alexander Alekhin
f2bec05e6d Merge pull request #12913 from dkurt:dnn_fix_ie_hyperparams 2018-11-16 18:36:12 +00:00
Dmitry Kurtaev
b5c54e447c Extra hyperparameters for Intel's Inference Engine layers 2018-11-15 20:06:37 +03:00
Alexander Alekhin
96c71dd3d2 dnn: reduce set of ignored warnings 2018-11-15 13:15:59 +03:00
WuZhiwen
6e3ea8b49d Merge pull request #12703 from wzw-intel:vkcom
* dnn: Add a Vulkan-based backend

This commit adds a new backend "DNN_BACKEND_VKCOM" and a
new target "DNN_TARGET_VULKAN". VKCOM stands for the Vulkan-based
computation library.

The backend uses the Vulkan API and SPIR-V shaders to run the
inference computation for layers. The layer types currently
implemented in DNN_BACKEND_VKCOM are:
Conv, Concat, ReLU, LRN, PriorBox, Softmax, MaxPooling,
AvePooling, Permute (see the usage sketch after this entry).

This is only the initial Vulkan work in OpenCV DNN; more layer
types will be supported and performance tuning is under way.

Signed-off-by: Wu Zhiwen <zhiwen.wu@intel.com>

* dnn/vulkan: Add FindVulkan.cmake to detect the Vulkan SDK

To build dnn with Vulkan support, install the Vulkan SDK, set the
"VULKAN_SDK" environment variable, and add "-DWITH_VULKAN=ON" to the
cmake command line.

The Vulkan SDK can be downloaded from:
https://vulkan.lunarg.com/sdk/home#linux

Installation instructions for Linux, Windows, and macOS respectively:
https://vulkan.lunarg.com/doc/sdk/latest/linux/getting_started.html
https://vulkan.lunarg.com/doc/sdk/latest/windows/getting_started.html
https://vulkan.lunarg.com/doc/sdk/latest/mac/getting_started.html

Running the Vulkan backend also requires the Mesa Vulkan driver; on
Ubuntu, install it with 'sudo apt-get install mesa-vulkan-drivers'.

To test, run '$BUILD_DIR/bin/opencv_test_dnn --gtest_filter=*VkCom*'.

Signed-off-by: Wu Zhiwen <zhiwen.wu@intel.com>

* dnn/Vulkan: dynamically load the Vulkan runtime

There is no compile-time dependency on the Vulkan library.
If the Vulkan runtime is unavailable, dnn falls back to the CPU path.

Use the "OPENCV_VULKAN_RUNTIME" environment variable to specify the
path to your own Vulkan runtime library.

Signed-off-by: Wu Zhiwen <zhiwen.wu@intel.com>

* dnn/Vulkan: Add a Python script to compile GLSL shaders to SPIR-V shaders

The SPIR-V shaders are stored as text-based 32-bit hexadecimal
numbers and inserted into .cpp files as unsigned 32-bit integer arrays.

* dnn/Vulkan: Put the Vulkan headers into the 3rdparty directory and some other fixes

The Vulkan header files are copied from
https://github.com/KhronosGroup/Vulkan-Docs/tree/master/include/vulkan
to 3rdparty/include.

Fix the copyright declaration issue.

Refine OpenCVDetectVulkan.cmake

* dnn/Vulkan: Add Vulkan backend tests to the existing ones.

Also fixed some test failures:

- Don't use a bool variable as a shader uniform
- Fix the dispatched group count exceeding the maximum
- Bypass "group > 1" convolution; this should be supported in the future.

* dnn/Vulkan: Fix multiple initialization in one thread.
2018-10-29 17:51:26 +03:00
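A minimal usage sketch for the Vulkan backend described above, assuming an OpenCV build configured with -DWITH_VULKAN=ON; the model file name and input shape below are placeholders, not part of the pull request:

    #include <opencv2/dnn.hpp>

    int main()
    {
        // "model.onnx" is a placeholder; load any network supported by dnn.
        cv::dnn::Net net = cv::dnn::readNet("model.onnx");

        // Opt in to the Vulkan compute backend and target added by this PR.
        net.setPreferableBackend(cv::dnn::DNN_BACKEND_VKCOM);
        net.setPreferableTarget(cv::dnn::DNN_TARGET_VULKAN);

        // Example NCHW input; the shape must match the chosen model.
        int sz[] = {1, 3, 224, 224};
        cv::Mat blob(4, sz, CV_32F, cv::Scalar(0));
        net.setInput(blob);
        cv::Mat out = net.forward();  // falls back to the CPU path if the Vulkan runtime is unavailable
        return 0;
    }
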
Alexander Alekhin
f33cbe94dd Merge pull request #12142 from alalek:dnn_ocl_fix_convolution_perf_tests 2018-08-31 13:27:14 +00:00
Dmitry Kurtaev
50bceea038 Include preprocessing nodes in TensorFlow object detection networks (#12211)
* Include preprocessing nodes in TensorFlow object detection networks (see the sketch after this entry)

* Enable more fusion

* faster_rcnn_resnet50_coco_2018_01_28 test
2018-08-31 15:41:56 +03:00
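A minimal sketch of loading a TensorFlow object detection model whose imported graph keeps the preprocessing nodes, as this change enables; the file names and the 300x300 input size are placeholders/assumptions, not taken from the pull request:

    #include <opencv2/dnn.hpp>

    int main()
    {
        // Frozen weights plus the generated text graph (both names are placeholders).
        cv::dnn::Net net = cv::dnn::readNetFromTensorflow(
            "frozen_inference_graph.pb", "graph.pbtxt");

        // With preprocessing kept inside the graph, a plain image blob is enough.
        int sz[] = {1, 3, 300, 300};
        cv::Mat blob(4, sz, CV_32F, cv::Scalar(0));
        net.setInput(blob);
        cv::Mat detections = net.forward();
        return 0;
    }
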
Alexander Alekhin
c557193b8c dnn(test): use dnnBackendsAndTargets() param generator 2018-08-31 15:11:58 +03:00
Dmitry Kurtaev
3e027df583 Enable more deep learning tests using Intel's Inference Engine backend 2018-08-27 18:37:35 +03:00
Dmitry Kurtaev
ed0e79cb61 Add missing parameter to DetectionOutput layer from Intel's Inference Engine 2018-07-31 11:37:45 +03:00
Alexander Alekhin
452fa3011c dnn(test): drop CV_ENUM for DNNBackend / DNNTarget 2018-07-10 15:12:01 +03:00
Dmitry Kurtaev
019c2f2115 Enable more deep learning tests 2018-07-05 14:23:15 +03:00
Dmitry Kurtaev
b11e22c25b Update Inference Engine tests 2018-06-26 15:38:08 +03:00
Dmitry Kurtaev
2c291bc2fb Enable FastNeuralStyle and OpenFace networks with IE backend 2018-06-09 15:57:12 +03:00
Dmitry Kurtaev
40765c5f8d Enable SSD models from TensorFlow with the OpenCL plugin of Intel's Inference Engine 2018-06-08 16:55:21 +03:00
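A minimal sketch of selecting the Inference Engine backend with its OpenCL plugin for such a network, assuming an OpenCV build with the IE backend enabled; the file names are placeholders:

    #include <opencv2/dnn.hpp>

    int main()
    {
        // Placeholder SSD model files.
        cv::dnn::Net net = cv::dnn::readNetFromTensorflow("ssd.pb", "ssd.pbtxt");

        // Route execution through Intel's Inference Engine on an OpenCL (GPU) target.
        net.setPreferableBackend(cv::dnn::DNN_BACKEND_INFERENCE_ENGINE);
        net.setPreferableTarget(cv::dnn::DNN_TARGET_OPENCL);
        return 0;
    }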