Android NDK camera support
* Add native camera video backend for Android
* In the event of a "No buffer available" error, wait for the appropriate callback and retry
* Fix stale context when creating a new AndroidCameraCapture
* Add property handling
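A rough usage sketch from the application side (hedged: the `cv::CAP_ANDROID` backend identifier, camera index and property values are illustrative assumptions, not taken from this changelog):

```cpp
// Hedged sketch only: assumes an Android build with the NDK camera backend
// enabled; the camera index and property values below are illustrative.
#include <opencv2/core.hpp>
#include <opencv2/videoio.hpp>

bool grabOneFrame(cv::Mat& frame)
{
    cv::VideoCapture cap(0, cv::CAP_ANDROID);   // request the native Android backend
    if (!cap.isOpened())
        return false;

    cap.set(cv::CAP_PROP_FRAME_WIDTH, 1280);    // property handling added by this change
    cap.set(cv::CAP_PROP_FRAME_HEIGHT, 720);

    // On "No buffer available" the backend waits for the callback and retries
    // internally, so a plain read() is sufficient here.
    return cap.read(frame) && !frame.empty();
}
```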
uEye cameras are made by IDS, cf. https://en.ids-imaging.com/
Currently supports driver version 4.94 and up, since the event system was overhauled in that version.
Supports setting/getting the properties: fps, width, height
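A minimal sketch of using the backend for those properties (assumes OpenCV was built with the uEye backend and an IDS driver >= 4.94; the camera index and values are illustrative):

```cpp
// Minimal sketch, assuming a build with the uEye backend enabled and an
// IDS driver >= 4.94; the camera index and values are illustrative.
#include <opencv2/videoio.hpp>
#include <iostream>

int main()
{
    cv::VideoCapture cap(0, cv::CAP_UEYE);
    if (!cap.isOpened())
        return 1;

    // fps, width and height are the properties this backend supports.
    cap.set(cv::CAP_PROP_FPS, 30);
    cap.set(cv::CAP_PROP_FRAME_WIDTH, 1280);
    cap.set(cv::CAP_PROP_FRAME_HEIGHT, 720);

    std::cout << "fps: "  << cap.get(cv::CAP_PROP_FPS)
              << " size: " << cap.get(cv::CAP_PROP_FRAME_WIDTH)
              << "x"       << cap.get(cv::CAP_PROP_FRAME_HEIGHT) << std::endl;
    return 0;
}
```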
Objective-C binding
* Initial work on Objective-C wrapper
* Objective-C generator script; update manually generated wrappers
* Add Mat tests
* Core Tests
* Imgproc wrapper generation and tests
* Fixes for Imgcodecs wrapper
* Miscellaneous fixes; Swift build support
* Objective-C wrapper build/install
* Add Swift wrappers for videoio/objdetect/feature2d
* Framework build; iOS support
* Fix toArray functions; use enum types whenever possible
* Use enum types where possible; prepare test build
* Update test
* Add test runner scripts for iOS and macOS
* Add test scripts and samples
* Build fixes
* Fix build (cmake 3.17.x compatibility)
* Fix warnings
* Fix handling of conflicting enum names
* Add support for document generation with Jazzy
* Swift/Native fast accessor functions
* Add Objective-C wrapper for calib3d, dnn, ml, photo and video modules
* Remove IntOut/FloatOut/DoubleOut classes
* Fix iOS default test platform value
* Fix samples
* Revert default framework name to opencv2
* Add converter util functions
* Fix failing test
* Fix whitespace
* Add handling for deprecated methods; fix warnings; define __OPENCV_BUILD
* Suppress cmake warnings
* Reduce severity of "jazzy not found" log message
* Fix incorrect #include of compatibility header in ios.h
* Use explicit returns in subscript/get implementation
* Reduce minimum required cmake version to 3.15 for Objective-C/Swift binding
cap_libv4l depends on an external library (libv4l), yet it is still larger
(1966 LOC vs 1822 LOC).
It was originally introduced as a copy-paste of cap_v4l in order to offload
various color conversions to libv4l.
Nowadays, however, we handle most of the needed color conversions inside
OpenCV. Our own implementation is better tested and probably also
performs better, as it can optionally leverage IPP/OpenCL.
Currently cap_v4l is better maintained and its code is generally in
better shape. There is, however, an API
difference in getting unconverted frames:
* with cap_libv4l one needs to set `CV_CAP_MODE_GRAY=1` or
`CV_CAP_MODE_YUYV=1`
* with cap_v4l one needs to set `CV_CAP_PROP_CONVERT_RGB=0`
The latter is more flexible, as it also allows accessing undecoded
JPEG images.
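A sketch of the cap_v4l approach, using the modern property name `CAP_PROP_CONVERT_RGB` (equivalent to `CV_CAP_PROP_CONVERT_RGB`); the device index is illustrative:

```cpp
// Sketch of the cap_v4l way to get unconverted frames; the device index
// is illustrative.
#include <opencv2/core.hpp>
#include <opencv2/videoio.hpp>

int main()
{
    cv::VideoCapture cap(0, cv::CAP_V4L2);
    if (!cap.isOpened())
        return 1;

    cap.set(cv::CAP_PROP_CONVERT_RGB, 0);   // deliver raw buffers, e.g. undecoded MJPEG

    cv::Mat raw;
    cap.read(raw);                          // raw, unconverted frame data
    return raw.empty() ? 1 : 0;
}
```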
fixes #4563
The entire AssetsLibrary framework has been deprecated since iOS 8.0. The
camera example code can use UIKit to save videos to the camera roll
instead, which avoids linking against PhotoKit and prevents increasing
the iOS deployment target.
CMake: Building Dynamic Framework on iOS (#8009)
* Updated python script with dynamic parameter
Updated python script to build static library by default
Updated python script to include bitcode flag
Added bitcode flag to c flags
Fixed directories and targets for the static build
Bitcode parameter fixed
Fixed script for static library
Fixed parameters in build function
Updated cmake common toolchain
Added changes to OpenCV Utils
Updates to cmake
Added CACHE INTERNAL variables
Updates to common toolchain
Fixed path in framework destination and added UIKit dependency
Dynamic plist for framework
Removed hardcoded lib version value
Removed trailing whitespace in toolchain
* Removed trailing whitespace
* Fixed typo in comment
* Renamed bitcode variable to bitcodedisabled
* Fixed target device family