Recover pose from different cameras (version 2)
* add recoverPose for two different cameras
* Address review comments from original PR
* Address new review comments
* Rename private api
Co-authored-by: tompollok <tom.pollok@gmail.com>
Co-authored-by: Zane <zane.huang@mail.utoronto.ca>
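A minimal sketch of how the new two-camera overload is meant to be called; the argument order (per-camera intrinsics and distortion, then E, R, t) is assumed from the description above and may differ slightly in the merged API.
```cpp
#include <opencv2/calib3d.hpp>
#include <vector>

// Sketch of the two-camera overload: each view has its own intrinsics
// (K1, K2) and distortion (d1, d2).
cv::Mat relativeRotation(const std::vector<cv::Point2f>& pts1,
                         const std::vector<cv::Point2f>& pts2,
                         const cv::Mat& K1, const cv::Mat& d1,
                         const cv::Mat& K2, const cv::Mat& d2)
{
    cv::Mat E, R, t, mask;
    cv::recoverPose(pts1, pts2, K1, d1, K2, d2, E, R, t,
                    cv::RANSAC, 0.999, 1.0, mask);
    return R; // rotation from camera 1 to camera 2; t is recovered up to scale
}
```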
Function is validated. Included an update to DISABLED_Calib3d_InitInverseRectificationMap.
Includes updates per input from @alalek and unit test regression # to reflect PR #
Update to initInverseRectificationMap()
* update to initInverseRectificationMap() documentation
* Restructured Calib3d_InitInverseRectificationMap unit test per feedback from alalek
* whitespace
Fixed trailing whitespace
Update to initInverseRectificationMap documentation for clarity
Added test case for initInverseRectificationMap()
Updated documentation.
Fixed whitespace error in docs
Small update to test function
Now passes success_error_level
final update to inverseRectification documentation
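A short usage sketch for the function this series documents and tests, assuming the argument order mirrors initUndistortRectifyMap.
```cpp
#include <opencv2/calib3d.hpp>
#include <opencv2/imgproc.hpp>

// Build inverse rectification maps, e.g. to warp an ideal (undistorted,
// rectified) pattern into the distorted camera frame. K, distCoeffs, R and
// newK are assumed to come from a prior calibration.
void buildInverseRectificationMaps(const cv::Mat& K, const cv::Mat& distCoeffs,
                                   const cv::Mat& R, const cv::Mat& newK,
                                   cv::Size size, cv::Mat& map1, cv::Mat& map2)
{
    cv::initInverseRectificationMap(K, distCoeffs, R, newK, size, CV_32FC1, map1, map2);
    // Apply the maps with: cv::remap(src, dst, map1, map2, cv::INTER_LINEAR);
}
```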
* Added CALIB_FIX_FOCAL_LENGTH to fisheye calibration #13450
Sometimes you want to calibrate just the principal point of a camera, or just the distortion coefficients. In this case, you can pass the CALIB_FIX_FOCAL_LENGTH flag to keep Fx and Fy fixed.
* Added test for the CALIB_FIX_FOCAL_LENGTH option in fisheye calibration.
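A sketch of how the flag is intended to be used; combining it with CALIB_USE_INTRINSIC_GUESS is an assumption made here so that the fixed Fx/Fy come from the caller-supplied K.
```cpp
#include <opencv2/calib3d.hpp>
#include <vector>

// Keep fx/fy fixed at the values already stored in K while calibrating
// the remaining fisheye intrinsics and distortion coefficients.
double calibrateWithFixedFocal(const std::vector<std::vector<cv::Point3f> >& objPts,
                               const std::vector<std::vector<cv::Point2f> >& imgPts,
                               cv::Size imageSize, cv::Mat& K, cv::Mat& D)
{
    std::vector<cv::Mat> rvecs, tvecs;
    int flags = cv::fisheye::CALIB_USE_INTRINSIC_GUESS
              | cv::fisheye::CALIB_FIX_FOCAL_LENGTH;
    return cv::fisheye::calibrate(objPts, imgPts, imageSize, K, D, rvecs, tvecs, flags);
}
```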
calib3d(usac): do not crash on empty models
* calib3d(test): regression test for issue 19639
* calib3d(usac): do not crash in setModelParameters()
* calib3d(usac): handle empty models in isModelGood()
Fisheye test has been updated to use the new enum cv::fisheye::CALIB_ZERO_DISPARITY and now includes CV_StaticAssert(...) to ensure cv::CALIB_ZERO_DISPARITY == cv::fisheye::CALIB_ZERO_DISPARITY.
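Roughly what that compile-time check looks like (a sketch, not the exact test code):
```cpp
#include <opencv2/calib3d.hpp>
#include <opencv2/core/base.hpp>

// The fisheye enum value must stay identical to the top-level constant.
CV_StaticAssert((int)cv::CALIB_ZERO_DISPARITY == (int)cv::fisheye::CALIB_ZERO_DISPARITY,
                "cv::fisheye::CALIB_ZERO_DISPARITY must equal cv::CALIB_ZERO_DISPARITY");
```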
- detect case with infinite loop and raise NoConv exception
- handle such exception
- add support for case with missing `blobDetector` (image contains Point2f array of candidates)
- add regression test
- undo rectification for "failed" detections too
- drop redirectError() usage
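A hedged sketch of the "missing blobDetector" path in the list above: already-detected candidate centers are passed in place of an image and the detector argument is left empty, so findCirclesGrid only performs the grid-ordering step. The behaviour is inferred from the commit description, not guaranteed.
```cpp
#include <opencv2/calib3d.hpp>
#include <opencv2/features2d.hpp>
#include <vector>

bool gridFromCandidates(const std::vector<cv::Point2f>& candidates,
                        cv::Size patternSize,
                        std::vector<cv::Point2f>& centers)
{
    return cv::findCirclesGrid(candidates, patternSize, centers,
                               cv::CALIB_CB_SYMMETRIC_GRID,
                               cv::Ptr<cv::FeatureDetector>()); // no blob detector
}
```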
Added SQPnP algorithm to SolvePnP
* Added sqpnp
* Fixed test case
* Added fix for duplicate point checking and inverse func reuse
* Changes for 3x speedup
Changed norm method (significant speed increase), changed nearest rotation computation to FOAM
* Added symmetric 3x3 inverse and unrolled loops
* Fixed error with SVD
* Fixed error with indices
Indices were initialized to negative values; when the null space is large, the points are coplanar, and the rotation is near zero, the indices were never updated.
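For context, the solver is selected through the usual solvePnP interface; a minimal sketch using the SOLVEPNP_SQPNP flag:
```cpp
#include <opencv2/calib3d.hpp>
#include <vector>

// Estimate a pose with the SQPnP solver via the generic solvePnP entry point.
bool poseWithSQPnP(const std::vector<cv::Point3f>& objectPts,
                   const std::vector<cv::Point2f>& imagePts,
                   const cv::Mat& K, const cv::Mat& dist,
                   cv::Mat& rvec, cv::Mat& tvec)
{
    return cv::solvePnP(objectPts, imagePts, K, dist, rvec, tvec,
                        false /*useExtrinsicGuess*/, cv::SOLVEPNP_SQPNP);
}
```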
* add findEssentialMat for two different cameras
* added smoke test for the newly added variant of findEssentialMat
Co-authored-by: tompollok <tom.pollok@gmail.com>
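A minimal sketch of the two-camera variant: each point set carries its own intrinsics and distortion, with the argument order assumed from the single-camera overload.
```cpp
#include <opencv2/calib3d.hpp>
#include <vector>

// Essential matrix between correspondences from two heterogeneous cameras.
cv::Mat essentialFromTwoCameras(const std::vector<cv::Point2f>& pts1,
                                const std::vector<cv::Point2f>& pts2,
                                const cv::Mat& K1, const cv::Mat& d1,
                                const cv::Mat& K2, const cv::Mat& d2)
{
    cv::Mat mask;
    return cv::findEssentialMat(pts1, pts2, K1, d1, K2, d2,
                                cv::RANSAC, 0.999, 1.0, mask);
}
```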
fix unstable fisheye undistortPoints
* remove artefacts when (un)distorting fisheye images with large distortion coefficient values
* fix fisheye undistortion when theta is close to zero
* add fisheye image undistort and distort test
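A short usage sketch of the code path touched by this fix:
```cpp
#include <opencv2/calib3d.hpp>
#include <vector>

// Map distorted fisheye image points back to ideal coordinates
// given intrinsics K and distortion coefficients D.
void undistortFisheyePoints(const std::vector<cv::Point2f>& distorted,
                            std::vector<cv::Point2f>& undistorted,
                            const cv::Mat& K, const cv::Mat& D)
{
    // With R and P left at their defaults the output is in normalized
    // coordinates; pass P = K to get pixel coordinates instead.
    cv::fisheye::undistortPoints(distorted, undistorted, K, D);
}
```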
* Fixed type conversion warnings
* fixed trailing whitespace
* Fixed indexing in prefilter
* Initialised prefilter
* Initialised prefilter with value initialisation
* Added a test case to trigger different memory allocations in BufferBM
* Optimized cases to check only the needed conditions
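These changes are internal to the block-matching prefilter; a small sketch of the user-facing knobs that exercise it (parameter values are illustrative only):
```cpp
#include <opencv2/calib3d.hpp>

// Configure the StereoBM prefilter and compute a disparity map.
cv::Mat disparityBM(const cv::Mat& leftGray, const cv::Mat& rightGray)
{
    cv::Ptr<cv::StereoBM> bm = cv::StereoBM::create(64 /*numDisparities*/, 21 /*blockSize*/);
    bm->setPreFilterType(cv::StereoBM::PREFILTER_XSOBEL);
    bm->setPreFilterCap(31);
    cv::Mat disp;
    bm->compute(leftGray, rightGray, disp);
    return disp; // 16-bit fixed-point disparities (scaled by 16)
}
```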
added estimateTranslation3D to calib3d/ptsetreg
* added estimateTranslation3D; follows the API and implementation structure of estimateAffine3D, but only allows for translation
* cast unused variables to void in the null function to suppress compiler warnings
* added test for estimateTranslation3D
* changed to Matx13d datatype for translation vector in ptsetreg and test; used short license in test
* removed iostream include
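A minimal usage sketch, assuming the signature mirrors estimateAffine3D as described above:
```cpp
#include <opencv2/calib3d.hpp>
#include <vector>

// Robustly estimate a pure translation between two 3D point sets.
cv::Mat robustTranslation(const std::vector<cv::Point3f>& src,
                          const std::vector<cv::Point3f>& dst)
{
    cv::Mat t, inliers; // t holds the estimated translation vector
    int ok = cv::estimateTranslation3D(src, dst, t, inliers,
                                       3.0 /*ransacThreshold*/, 0.99 /*confidence*/);
    return ok ? t : cv::Mat();
}
```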
* calib3d: code cleanup
Image sharpness, as well as brightness, is a critical parameter for
accurate camera calibration. To make these parameters accessible for
filtering out problematic calibration images, this method calculates
edge profiles by traveling from black to white chessboard cell centers.
Based on these profiles, the number of pixels required to transition
from black to white is computed. This width of the transition area is a
good indication of how sharply the chessboard is imaged and should be
below ~3.0 pixels.
Based on this, motion blur can also be detected by comparing the
sharpness in the vertical and horizontal directions. All unsharp images
should be excluded from calibration, as they will corrupt the
calibration result. The same is true for overexposed images due to the
non-linear sensor response. Overexposure can be detected by looking at
the average cell brightness of the detected chessboard.
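A sketch of how such a sharpness check could be applied before accepting a calibration view; the function name estimateChessboardSharpness and the packing of its return value are assumptions here, not stated in the text above.
```cpp
#include <opencv2/calib3d.hpp>
#include <vector>

// Reject a view whose average black-to-white transition is wider than ~3 px.
bool isViewSharpEnough(const cv::Mat& gray, cv::Size patternSize,
                       const std::vector<cv::Point2f>& corners)
{
    cv::Scalar stats = cv::estimateChessboardSharpness(gray, patternSize, corners);
    double avgEdgeWidthPx = stats[0]; // assumed: average transition width in pixels
    return avgEdgeWidthPx < 3.0;      // threshold taken from the text above
}
```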