revise default proto to match the filename in the documentation
fix a bug
beautify Python code
fix bug
beautify code
add test samples with larger/smaller sizes
remove useless code
use a bytearray instead of creating a temporary file (see the sketch below)
remove useless code
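For context, loading a model from an in-memory buffer rather than a temporary file can look like the sketch below. The file names are placeholders and readNetFromCaffe is used only as one example of a buffer-accepting loader; treat the exact call as an assumption, not the sample's actual code.

```python
import cv2 as cv

# Read the model files into memory and hand the raw bytes to the DNN loader,
# so no temporary file has to be written to disk.
with open("model.prototxt", "rb") as f:       # placeholder paths
    proto_buf = bytearray(f.read())
with open("model.caffemodel", "rb") as f:
    weights_buf = bytearray(f.read())

# readNetFromCaffe also accepts in-memory buffers (a bytearray works here
# because the binding treats it as a sequence of bytes).
net = cv.dnn.readNetFromCaffe(proto_buf, weights_buf)
print(net.getLayerNames()[:5])
```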
Improving DaSiamRPN tracker sample
* changed layerBlobs in dnn.cpp and added DaSiamRPN tracker
* Improving DaSiamRPN tracker sample
* Docs fix
* Removed outdated changes
* Trying to reinitialize the tracker without reloading the models. Works with a LaSOT-based benchmark at a reinit rate of 250 frames
* Trying to reverse changes
* Moving model loading into the constructor
* Fixing some issues with names
* Variable name changed
* Revert parser argument changes
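For orientation, a tracking loop of the kind the sample drives typically looks like the sketch below. It uses the TrackerDaSiamRPN wrapper that later exposed the same ONNX models through the video module, with the conventional model file names and a placeholder video path; this is not the sample's own code.

```python
import cv2 as cv

# Assumes an OpenCV build that ships cv.TrackerDaSiamRPN and that the three
# conventional ONNX files are available next to the script.
params = cv.TrackerDaSiamRPN_Params()
params.model = "dasiamrpn_model.onnx"
params.kernel_cls1 = "dasiamrpn_kernel_cls1.onnx"
params.kernel_r1 = "dasiamrpn_kernel_r1.onnx"
tracker = cv.TrackerDaSiamRPN_create(params)

cap = cv.VideoCapture("input.mp4")            # placeholder video path
ok, frame = cap.read()
bbox = cv.selectROI("tracking", frame)        # user selects the initial target
tracker.init(frame, bbox)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    found, bbox = tracker.update(frame)
    if found:
        x, y, w, h = map(int, bbox)
        cv.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv.imshow("tracking", frame)
    if cv.waitKey(1) == 27:                   # Esc quits
        break
```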
* Add a FLANN example showing how to search a query image in a dataset
* Clean: remove warning
* Replace the dependency on boost::filesystem with calls to core/utils/filesystem
* Wait for escape key to exit
* Add an example of binary descriptors support
* Add program options for saving and loading the flann structure
* Fix warnings on Win64
* Fix warnings on 3.4 branch still relying on C++03
* Add ctor to img_info structure
* Comment modifications
* Demo file of FLANN moved and renamed
* Fix distances type when using binary vectors in the FLANN example
* Rename FLANN example file
* Remove the FLANN example's dependency on opencv_contrib's SURF.
* Remove mentions of FLANN and other descriptors that were meant to hint at the other options
* Cleaner program options management
* Make waitKey usage minimal in FLANN example
* Fix the conditions order
* Use cv::Ptr
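The gist of the new sample, sketched here with the Python bindings instead of C++: search a query image against a small dataset by counting good FLANN matches on binary (ORB) descriptors. Paths are placeholders; the LSH index parameters are the usual values from the OpenCV FLANN matching tutorial.

```python
import cv2 as cv
import glob

orb = cv.ORB_create()
query = cv.imread("query.png", cv.IMREAD_GRAYSCALE)      # placeholder paths
_, query_desc = orb.detectAndCompute(query, None)

# LSH index for binary descriptors (FLANN_INDEX_LSH == 6).
index_params = dict(algorithm=6, table_number=6, key_size=12, multi_probe_level=1)
matcher = cv.FlannBasedMatcher(index_params, dict(checks=32))

best_name, best_score = None, -1
for name in glob.glob("dataset/*.png"):
    img = cv.imread(name, cv.IMREAD_GRAYSCALE)
    _, desc = orb.detectAndCompute(img, None)
    if desc is None:
        continue
    matches = matcher.knnMatch(query_desc, desc, k=2)
    # Lowe's ratio test; LSH can return incomplete pairs, so check the length.
    good = [p[0] for p in matches if len(p) == 2 and p[0].distance < 0.7 * p[1].distance]
    if len(good) > best_score:
        best_name, best_score = name, len(good)

print("best match:", best_name, "with", best_score, "good matches")
```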
* hopefully eliminated compile warnings and errors, as well as a failure in one test
* fixed a few typos
* decreased buffer size in some cases
* added more optimal im2row branch in the case of 1x1 convolutions
* tuned fastConv to reduce the number of passes over arrays
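To see why a 1x1 convolution admits a cheaper im2row path: with a 1x1 kernel, stride 1 and no padding, the unrolled matrix is just the input laid out row-wise, so the whole layer reduces to a single GEMM with no patch copying. A NumPy illustration (shapes chosen arbitrarily):

```python
import numpy as np

C_in, C_out, H, W = 8, 16, 5, 7
x = np.random.rand(C_in, H, W).astype(np.float32)    # input feature map (CHW)
w = np.random.rand(C_out, C_in).astype(np.float32)   # 1x1 kernels

# im2row for a 1x1 kernel is only a reshape: each pixel already is a "patch".
rows = x.reshape(C_in, H * W).T                       # (H*W, C_in), no patch unrolling
y = (rows @ w.T).T.reshape(C_out, H, W)               # one GEMM, then back to CHW

ref = np.einsum("oc,chw->ohw", w, x)                  # reference: per-pixel dot products
assert np.allclose(y, ref, atol=1e-5)
```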
backport of commit 77b01deb80
add relu option
add relu as an activation option in darknet
simplify the setParams if-else ladder
add relu as an activation option in darknet
correct activation_param type
format
format
add relu as an activation option in darknet
spacing
spacing
add relu as an activation option in darknet
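A hedged sketch of what the option enables on the importer side: a [convolutional] block whose activation is relu instead of leaky. The tiny cfg below and the weights-less readNetFromDarknet call are illustrative assumptions, not code from this change.

```python
import cv2 as cv
import os
import tempfile

# Hypothetical minimal Darknet cfg using activation=relu.
cfg = """[net]
batch=1
width=32
height=32
channels=3

[convolutional]
filters=8
size=3
stride=1
pad=1
activation=relu
"""

with tempfile.NamedTemporaryFile("w", suffix=".cfg", delete=False) as f:
    f.write(cfg)
    cfg_path = f.name

try:
    # Weights are omitted for brevity; the importer still builds the topology,
    # so a ReLU layer should appear after the convolution in the printed names.
    net = cv.dnn.readNetFromDarknet(cfg_path)
    print(net.getLayerNames())
finally:
    os.remove(cfg_path)
```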
* Possibility to set more than one tree for the hierarchical KMeans (the default is still 1 tree).
This particularly improves NN retrieval results with binary vectors, giving better quality
than LSH for similar processing time when speed is the criterion.
* Add explanations on the FLANN's hierarchical KMeans for binary data.
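A speculative Python sketch of the same idea: a hierarchical-clustering FLANN index over binary (ORB) descriptors with several trees instead of the default single one. The algorithm/distance ids and the parameter key names mirror the C++ HierarchicalClusteringIndexParams arguments and are assumptions here, as are the image paths.

```python
import cv2 as cv

FLANN_INDEX_HIERARCHICAL = 5   # assumed cvflann algorithm id
FLANN_DIST_HAMMING = 9         # assumed cvflann distance id for binary data

orb = cv.ORB_create()
train_img = cv.imread("scene.png", cv.IMREAD_GRAYSCALE)     # placeholder paths
query_img = cv.imread("query.png", cv.IMREAD_GRAYSCALE)
_, train_desc = orb.detectAndCompute(train_img, None)
_, query_desc = orb.detectAndCompute(query_img, None)

# Several trees (here 4) instead of the single default tree.
index_params = dict(algorithm=FLANN_INDEX_HIERARCHICAL, branching=32,
                    centers_init=0, trees=4, leaf_size=100)
index = cv.flann_Index(train_desc, index_params, FLANN_DIST_HAMMING)
indices, dists = index.knnSearch(query_desc, 2, params={})
print(indices.shape, dists.shape)
```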
DNN: OpenCL/slice update
* dnn(ocl/slice): make slice kernel VTune friendly
- more unique names
- inline code of copy functions
* dnn(ocl/slice): prefer to spawn more work groups
- even in the 1D copy case
- perf improvement of up to 2x in kernel time (due to the configuration change 128x1x1 => 128x32x1)
* dnn(ocl/slice): cache kernel exec info
* Implement ASIFT in C++
* '>>' should be '> >' within a nested template
* add a sample for asift usage
* bugfix: empty keypoints caused a crash
* simpler initialization for mask
* reduce the number of lines
* correct the TeX document
* type casting
* add descriptorSize for ASIFT
* smaller test data for ASIFT
* even smaller test data
* add OpenCV short license header
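The new functionality is exposed as cv::AffineFeature, which wraps any Feature2D backend and simulates affine viewpoint changes (tilts/rotations) around it. A minimal usage sketch (the image path is a placeholder):

```python
import cv2 as cv

img = cv.imread("view.png", cv.IMREAD_GRAYSCALE)   # placeholder image path
backend = cv.SIFT_create()                          # any Feature2D works as the backend
asift = cv.AffineFeature_create(backend)            # ASIFT-style viewpoint sampling
keypoints, descriptors = asift.detectAndCompute(img, None)
print(len(keypoints), "keypoints, descriptor size:", asift.descriptorSize())
```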
libjasper has recently changed `jas_matrix_get` from a macro to an inline function
(389951d071 in https://github.com/jasper-software/jasper), causing the build to fail.