doc: add new tutorial motion deblur filter (#12215)

* doc: add new tutorial motion deblur filter

* Update motion_deblur_filter.markdown

a few minor changes
Vlad Karpushin 2018-08-31 21:41:22 +07:00 committed by Vadim Pisarevsky
parent f36790d521
commit 7d3a1657c9
7 changed files with 266 additions and 0 deletions

Binary file not shown (new image added, 33 KiB)

Binary file not shown (new image added, 51 KiB)

Binary file not shown (new image added, 556 B)

Binary file not shown (new image added, 20 KiB)


@ -0,0 +1,72 @@
Motion Deblur Filter {#tutorial_motion_deblur_filter}
==========================

Goal
----

In this tutorial you will learn:

- what the PSF of a motion-blurred image is
- how to restore a motion-blurred image

Theory
------

For the theory of the image degradation model and the Wiener filter, you can refer to the tutorial @ref tutorial_out_of_focus_deblur_filter "Out-of-focus Deblur Filter".
On this page only linear motion blur distortion is considered. The motion-blurred image used here is a real-world photograph, and the blur was caused by a moving subject.

### What is the PSF of a motion-blurred image?

The point spread function (PSF) of a linear motion blur distortion is a line segment. Such a PSF is specified by two parameters: \f$LEN\f$ is the length of the blur and \f$THETA\f$ is the angle of motion.

![Point spread function of a linear motion blur distortion](images/motion_psf.png)
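
For illustration only, such a PSF can also be rasterized directly. The helper below is a minimal sketch, not part of the tutorial's sample code, and its angle convention should be checked against the calcPSF() function shown in the Explanation section (the sample draws the segment with cv::ellipse instead):

@code{.cpp}
#include <cmath>
#include "opencv2/imgproc.hpp"

// Hypothetical helper: draw a centered line segment of length len, rotated by
// theta degrees, into a float image and normalize it to unit sum.
cv::Mat makeLinePSF(cv::Size sz, int len, double theta)
{
    cv::Mat psf(sz, CV_32F, cv::Scalar(0));
    cv::Point c(sz.width / 2, sz.height / 2);
    double rad = theta * CV_PI / 180.0;
    cv::Point d(cvRound(0.5 * len * std::cos(rad)), cvRound(-0.5 * len * std::sin(rad)));
    cv::line(psf, c - d, c + d, cv::Scalar(1), 1);
    return psf / cv::sum(psf)[0];
}
@endcode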

### How to restore a blurred image?

On this page the Wiener filter is used as the restoration filter; for details you can refer to the tutorial @ref tutorial_out_of_focus_deblur_filter "Out-of-focus Deblur Filter".
To synthesize the Wiener filter for the motion blur case, you need to specify the signal-to-noise ratio (\f$SNR\f$) and the \f$LEN\f$ and \f$THETA\f$ parameters of the PSF.
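
As a reminder (see the out-of-focus tutorial referenced above), the Wiener filter used here can be written as

\f[H_w = \frac{H}{|H|^2 + \frac{1}{SNR}}\f]

where \f$H\f$ is the spectrum of the PSF. Because the PSF of this tutorial is real and (up to rasterization effects) symmetric about the image center, its spectrum is essentially real, which is why calcWnrFilter() in the sample works with the real part of the spectrum only.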

Source code
-----------

You can find the source code at `samples/cpp/tutorial_code/ImgProc/motion_deblur_filter/motion_deblur_filter.cpp` in the OpenCV source code library.

@include cpp/tutorial_code/ImgProc/motion_deblur_filter/motion_deblur_filter.cpp

Explanation
-----------

A motion-blurred image recovery algorithm consists of PSF generation, Wiener filter generation and filtering of the blurred image in the frequency domain:

@snippet samples/cpp/tutorial_code/ImgProc/motion_deblur_filter/motion_deblur_filter.cpp main

The sample reads \f$LEN\f$, \f$THETA\f$, \f$SNR\f$ and the input image name from the command line; the default values correspond to the black car example in the Result section.

The function calcPSF() forms a PSF according to the input parameters \f$LEN\f$ and \f$THETA\f$ (in degrees). The PSF is normalized to unit sum, so the filtering preserves the overall image brightness:

@snippet samples/cpp/tutorial_code/ImgProc/motion_deblur_filter/motion_deblur_filter.cpp calcPSF

The function edgetaper() tapers the edges of the input image in order to reduce the ringing effect in the restored image:

@snippet samples/cpp/tutorial_code/ImgProc/motion_deblur_filter/motion_deblur_filter.cpp edgetaper
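
In other words, edgetaper() multiplies the image by a separable weight \f$w(x,y) = w_1(x)\,w_2(y)\f$ built from hyperbolic tangents (a restatement of the code above, with \f$x\f$ sampled over \f$[-\pi, \pi)\f$ across the image width and \f$y\f$ likewise across the height):

\f[w_1(x) = \frac{1}{2}\left(\tanh\left(\frac{x + \gamma/2}{\beta}\right) - \tanh\left(\frac{x - \gamma/2}{\beta}\right)\right)\f]

With the default \f$\gamma = 5\f$ and \f$\beta = 0.2\f$ this weight is close to 1 in the interior of the image and falls towards 0 near its borders, which suppresses the sharp intensity jump at the image boundary that would otherwise cause ringing.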

The functions calcWnrFilter(), fftshift() and filter2DFreq() perform the filtering of the image by the specified PSF in the frequency domain. They are copied from the tutorial
@ref tutorial_out_of_focus_deblur_filter "Out-of-focus Deblur Filter".
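
Put compactly, the restoration performed by filter2DFreq() is

\f[S' = \mathcal{F}^{-1}\left(\mathcal{F}(I) \cdot H_w\right)\f]

where \f$I\f$ is the (edge-tapered) blurred image, \f$S'\f$ is the restored image, \f$\mathcal{F}\f$ denotes the forward DFT and the product is taken element-wise over the spectra (mulSpectrums()).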

Result
------

Below you can see a real-world image with motion blur distortion. The license plates of both cars are unreadable. The red markers show the locations of the license plates.

![Motion blur image. The license plates are not readable](images/motion_original.jpg)

Below you can see the restoration result for the black car's license plate. The result was computed with \f$LEN\f$ = 125, \f$THETA\f$ = 0, \f$SNR\f$ = 700.

![The restored image of the black car license plate](images/black_car.jpg)

Below you can see the restoration result for the white car's license plate. The result was computed with \f$LEN\f$ = 78, \f$THETA\f$ = 15, \f$SNR\f$ = 300.

![The restored image of the white car license plate](images/white_car.jpg)

The values of \f$SNR\f$, \f$LEN\f$ and \f$THETA\f$ were selected manually to give the best possible visual result. The \f$THETA\f$ parameter coincides with the car's direction of motion, and the
\f$LEN\f$ parameter depends on the car's speed: roughly, it is the distance in pixels that the car travels during the exposure.
The result is not perfect, but at least it gives us a hint of the image's content. With some effort, the license plates are now readable.

@note The parameters \f$LEN\f$ and \f$THETA\f$ are the most important. You should adjust \f$LEN\f$ and \f$THETA\f$ first, then \f$SNR\f$.
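
If you need to tune the parameters for your own image, a simple brute-force sweep can help. The snippet below is a hypothetical sketch, not part of the sample: it assumes the roi rectangle and the edge-tapered CV_32F image imgIn prepared exactly as in the sample's main(), and the output file names are purely illustrative.

@code{.cpp}
// Hypothetical sweep over LEN and THETA around an initial guess; SNR is kept
// fixed and tuned afterwards. One candidate restoration is written per pair
// so the results can be inspected visually.
for (int len = 100; len <= 150; len += 5)
{
    for (double theta = -10.0; theta <= 10.0; theta += 2.5)
    {
        Mat h, Hw, candidate;
        calcPSF(h, roi.size(), len, theta);
        calcWnrFilter(h, Hw, 1.0 / 700.0);           // SNR = 700
        filter2DFreq(imgIn(roi), candidate, Hw);
        candidate.convertTo(candidate, CV_8U);
        normalize(candidate, candidate, 0, 255, NORM_MINMAX);
        imwrite(format("candidate_len%d_theta%.1f.jpg", len, theta), candidate);
    }
}
@endcode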

You can also find a quick video demonstration of this license plate recovery method on
[YouTube](https://youtu.be/xSrE0hdhb4o).
@youtube{xSrE0hdhb4o}


@ -320,3 +320,13 @@ In this section you will learn about the image processing (manipulation) functio
    *Author:* Karpushin Vladislav

    You will learn how to recover an out-of-focus image by Wiener filter.

- @subpage tutorial_motion_deblur_filter

    *Languages:* C++

    *Compatibility:* \> OpenCV 2.0

    *Author:* Karpushin Vladislav

    You will learn how to recover an image with motion blur distortion using a Wiener filter.


@ -0,0 +1,184 @@
/**
* @brief You will learn how to recover an image with motion blur distortion using a Wiener filter
* @author Karpushin Vladislav, karpushin@ngs.ru, https://github.com/VladKarpushin
*/
#include <iostream>
#include "opencv2/imgproc.hpp"
#include "opencv2/imgcodecs.hpp"

using namespace cv;
using namespace std;

void help();
void calcPSF(Mat& outputImg, Size filterSize, int len, double theta);
void fftshift(const Mat& inputImg, Mat& outputImg);
void filter2DFreq(const Mat& inputImg, Mat& outputImg, const Mat& H);
void calcWnrFilter(const Mat& input_h_PSF, Mat& output_G, double nsr);
void edgetaper(const Mat& inputImg, Mat& outputImg, double gamma = 5.0, double beta = 0.2);

const String keys =
"{help h usage ? |             | print this message           }"
"{image          |input.png    | input image name             }"
"{LEN            |125          | length of a motion           }"
"{THETA          |0            | angle of a motion in degrees }"
"{SNR            |700          | signal to noise ratio        }"
;

int main(int argc, char *argv[])
{
    help();
    CommandLineParser parser(argc, argv, keys);
    if (parser.has("help"))
    {
        parser.printMessage();
        return 0;
    }

    int LEN = parser.get<int>("LEN");
    double THETA = parser.get<double>("THETA");
    int snr = parser.get<int>("SNR");
    string strInFileName = parser.get<String>("image");

    if (!parser.check())
    {
        parser.printErrors();
        return 0;
    }

    Mat imgIn;
    imgIn = imread(strInFileName, IMREAD_GRAYSCALE);
    if (imgIn.empty()) //check whether the image is loaded or not
    {
        cout << "ERROR : Image cannot be loaded..!!" << endl;
        return -1;
    }

    Mat imgOut;

//! [main]
    // only an even-sized part of the image is processed (even number of rows and columns)
    Rect roi = Rect(0, 0, imgIn.cols & -2, imgIn.rows & -2);

    //Hw calculation (start)
    Mat Hw, h;
    calcPSF(h, roi.size(), LEN, THETA);
    calcWnrFilter(h, Hw, 1.0 / double(snr));
    //Hw calculation (stop)

    imgIn.convertTo(imgIn, CV_32F);
    edgetaper(imgIn, imgIn);

    // filtering (start)
    filter2DFreq(imgIn(roi), imgOut, Hw);
    // filtering (stop)
//! [main]

    imgOut.convertTo(imgOut, CV_8U);
    normalize(imgOut, imgOut, 0, 255, NORM_MINMAX);
    imwrite("result.jpg", imgOut);
    return 0;
}

void help()
{
    cout << "2018-08-14" << endl;
    cout << "Motion_deblur_v2" << endl;
    cout << "You will learn how to recover an image with motion blur distortion using a Wiener filter" << endl;
}

//! [calcPSF]
void calcPSF(Mat& outputImg, Size filterSize, int len, double theta)
{
    Mat h(filterSize, CV_32F, Scalar(0));
    Point point(filterSize.width / 2, filterSize.height / 2);
    // draw the PSF as a degenerate (zero-width) filled ellipse: a line segment of
    // length len through the image center, rotated by (90 - theta) degrees
    ellipse(h, point, Size(0, cvRound(float(len) / 2.0)), 90.0 - theta, 0, 360, Scalar(255), FILLED);
    // normalize the PSF to unit sum
    Scalar summa = sum(h);
    outputImg = h / summa[0];
}
//! [calcPSF]

//! [fftshift]
void fftshift(const Mat& inputImg, Mat& outputImg)
{
    outputImg = inputImg.clone();
    int cx = outputImg.cols / 2;
    int cy = outputImg.rows / 2;
    Mat q0(outputImg, Rect(0, 0, cx, cy));
    Mat q1(outputImg, Rect(cx, 0, cx, cy));
    Mat q2(outputImg, Rect(0, cy, cx, cy));
    Mat q3(outputImg, Rect(cx, cy, cx, cy));
    // swap the quadrants diagonally so that the zero-frequency component is moved
    // between the image center and the top-left corner
    Mat tmp;
    q0.copyTo(tmp);
    q3.copyTo(q0);
    tmp.copyTo(q3);
    q1.copyTo(tmp);
    q2.copyTo(q1);
    tmp.copyTo(q2);
}
//! [fftshift]

//! [filter2DFreq]
void filter2DFreq(const Mat& inputImg, Mat& outputImg, const Mat& H)
{
    // forward DFT of the input image (packed as a two-channel complex matrix)
    Mat planes[2] = { Mat_<float>(inputImg.clone()), Mat::zeros(inputImg.size(), CV_32F) };
    Mat complexI;
    merge(planes, 2, complexI);
    dft(complexI, complexI, DFT_SCALE);

    Mat planesH[2] = { Mat_<float>(H.clone()), Mat::zeros(H.size(), CV_32F) };
    Mat complexH;
    merge(planesH, 2, complexH);

    // per-element multiplication of the two spectra, then inverse DFT
    Mat complexIH;
    mulSpectrums(complexI, complexH, complexIH, 0);
    idft(complexIH, complexIH);

    split(complexIH, planes);
    outputImg = planes[0];
}
//! [filter2DFreq]

//! [calcWnrFilter]
void calcWnrFilter(const Mat& input_h_PSF, Mat& output_G, double nsr)
{
    // shift the PSF so that its center moves to the top-left corner, take its DFT
    // and build the Wiener filter G = H / (|H|^2 + nsr) from the real part H of the
    // PSF spectrum (nsr = 1/SNR)
    Mat h_PSF_shifted;
    fftshift(input_h_PSF, h_PSF_shifted);
    Mat planes[2] = { Mat_<float>(h_PSF_shifted.clone()), Mat::zeros(h_PSF_shifted.size(), CV_32F) };
    Mat complexI;
    merge(planes, 2, complexI);
    dft(complexI, complexI);
    split(complexI, planes);
    Mat denom;
    pow(abs(planes[0]), 2, denom);
    denom += nsr;
    divide(planes[0], denom, output_G);
}
//! [calcWnrFilter]

//! [edgetaper]
void edgetaper(const Mat& inputImg, Mat& outputImg, double gamma, double beta)
{
    int Nx = inputImg.cols;
    int Ny = inputImg.rows;
    Mat w1(1, Nx, CV_32F, Scalar(0));
    Mat w2(Ny, 1, CV_32F, Scalar(0));

    // build 1D tanh-based windows along x and y that are close to 1 in the interior
    // and fall off towards 0 near the image borders
    float* p1 = w1.ptr<float>(0);
    float* p2 = w2.ptr<float>(0);
    float dx = float(2.0 * CV_PI / Nx);
    float x = float(-CV_PI);
    for (int i = 0; i < Nx; i++)
    {
        p1[i] = float(0.5 * (tanh((x + gamma / 2) / beta) - tanh((x - gamma / 2) / beta)));
        x += dx;
    }
    float dy = float(2.0 * CV_PI / Ny);
    float y = float(-CV_PI);
    for (int i = 0; i < Ny; i++)
    {
        p2[i] = float(0.5 * (tanh((y + gamma / 2) / beta) - tanh((y - gamma / 2) / beta)));
        y += dy;
    }

    // the 2D weight is the outer product of the two windows; apply it element-wise
    Mat w = w2 * w1;
    multiply(inputImg, w, outputImg);
}
//! [edgetaper]