opencv/modules/dnn
Yuantao Feng accf200408
Merge pull request #25238 from fengyuentau:optimized_const
dnn: avoid const layer forwarding in layer norm layer and attention layer #25238

While profiling ViTs with dnn, I found that `ConstLayer` can take a noticeable proportion of the inference time, which is weird. This comes from the data copy performed during the inference of `ConstLayer`. We could improve the efficiency of that copy, but the easiest and most convenient fix is to avoid `ConstLayer` altogether. This PR changes how constants are handled in the layer normalization layer and the attention layer: they are stored in the layer blobs instead of being wrapped in constant layers.
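To illustrate the idea (this is a hedged sketch, not actual OpenCV code — the struct and member names below are hypothetical), a const layer materializes its constant with a copy on every `forward()` call, while storing the constant in the consuming layer's `blobs` lets inference read it in place with no per-call copy:

```cpp
#include <cassert>
#include <vector>

// Hypothetical "const layer": returning the constant from forward()
// copies the whole tensor on every inference call.
struct ConstLayer {
    std::vector<float> data;                 // the constant tensor
    std::vector<float> forward() const {
        return data;                         // copy-constructed: O(n) per call
    }
};

// Hypothetical consuming layer in the spirit of this PR: the constant
// lives in the layer's own blobs, so inference just reads it in place.
struct NormLayer {
    std::vector<float> blobs;                // constant stored once at load time
    const std::vector<float>& weights() const {
        return blobs;                        // no copy at inference time
    }
};
```

The sketch only shows the copy-vs-reference distinction; the actual PR moves the constants into `Layer::blobs` and adjusts the layers' forward paths and backend wiring accordingly.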

Checklists:

- [x] Backend compatibility in layer normalization layer.

### Pull Request Readiness Checklist

See details at https://github.com/opencv/opencv/wiki/How_to_contribute#making-a-good-pull-request

- [x] I agree to contribute to the project under Apache 2 License.
- [x] To the best of my knowledge, the proposed patch is not based on code under GPL or another license that is incompatible with OpenCV
- [x] The PR is proposed to the proper branch
- [x] There is a reference to the original bug report and related work
- [x] There is accuracy test, performance test and test data in opencv_extra repository, if applicable
      Patch to opencv_extra has the same branch name.
- [x] The feature is well documented and sample code can be built with the project CMake
2024-03-26 15:09:51 +03:00