Extending template_matching tutorial with Java ()

* Extending template_matching tutorial with Java

* adding mask to java version of the tutorial

* adding the python toggle and code

* updating table of content

* adding py and java to table of content

* adding mask to python

* going back to markdown with duplicated text

* non duplicated text
Cartucho 2017-05-11 22:42:04 +01:00 committed by Maksim Shabunin
parent 3b669149d2
commit 2055bcc807
6 changed files with 473 additions and 97 deletions
doc
samples
cpp/tutorial_code/Histograms_Matching
java/tutorial_code/ImgProc/tutorial_template_matching
python/tutorial_code/imgProc/match_template


@@ -67,7 +67,7 @@ $("h2").each(function() {
$smallerHeadings = $(this).nextUntil("h2").filter("h3").add($(this).nextUntil("h2").find("h3"));
if ($smallerHeadings.length) {
$smallerHeadings.each(function() {
var $elements = $(this).nextUntil("h3").filter("div.newInnerHTML");
var $elements = $(this).nextUntil("h2,h3").filter("div.newInnerHTML");
buttonsToAdd($elements, $(this), "h3");
});
} else {


@@ -1,14 +1,17 @@
Template Matching {#tutorial_template_matching}
=================
@prev_tutorial{tutorial_back_projection}
@next_tutorial{tutorial_find_contours}
Goal
----
In this tutorial you will learn how to:
- Use the OpenCV function @ref cv::matchTemplate to search for matches between an image patch and
- Use the OpenCV function **matchTemplate()** to search for matches between an image patch and
an input image
- Use the OpenCV function @ref cv::minMaxLoc to find the maximum and minimum values (as well as
- Use the OpenCV function **minMaxLoc()** to find the maximum and minimum values (as well as
their positions) in a given array.
Theory
@@ -42,7 +45,7 @@ that should be used to find the match.
- By **sliding**, we mean moving the patch one pixel at a time (left to right, top to bottom). At
each location, a metric is calculated that represents how "good" or "bad" the match at that
location is (or how similar the patch is to that particular area of the source image).
- For each location of **T** over **I**, you *store* the metric in the *result matrix* **(R)**.
- For each location of **T** over **I**, you *store* the metric in the *result matrix* **R**.
Each location \f$(x,y)\f$ in **R** contains the match metric:
![](images/Template_Matching_Template_Theory_Result.jpg)
@@ -51,9 +54,8 @@ that should be used to find the match.
The brightest locations indicate the highest matches. As you can see, the location marked by the
red circle is probably the one with the highest value, so that location (the rectangle formed by
that point as a corner and width and height equal to the patch image) is considered the match.
- In practice, we use the function @ref cv::minMaxLoc to locate the highest value (or lower,
depending of the type of matching method) in the *R* matrix.
- In practice, we locate the highest value (or lowest, depending on the matching method) in
the *R* matrix, using the function **minMaxLoc()**.
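To make the sliding-window procedure above concrete, here is a minimal NumPy sketch of the idea
(an illustration only; the tutorial samples rely on **matchTemplate()** and **minMaxLoc()** to do
this work). It computes the squared-difference metric at every location of **T** over **I** and
then looks up the position with the best score:
@code{.py}
import numpy as np

I = np.random.rand(8, 8).astype(np.float32)   # source image I
T = I[2:5, 3:6].copy()                        # template T, cut out of I at (x=3, y=2)

# R has one entry per possible placement of T inside I
rows = I.shape[0] - T.shape[0] + 1
cols = I.shape[1] - T.shape[1] + 1
R = np.zeros((rows, cols), np.float32)

for y in range(rows):                         # slide T over I one pixel at a time
    for x in range(cols):
        patch = I[y:y + T.shape[0], x:x + T.shape[1]]
        R[y, x] = np.sum((patch - T) ** 2)    # squared-difference metric (lower is better)

best = np.unravel_index(np.argmin(R), R.shape)
print('best match at (x, y) =', (best[1], best[0]))   # prints (3, 2)
@endcode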
### How does the mask work?
- If masking is needed for the match, three components are required:
@@ -81,7 +83,7 @@ that should be used to find the match.
### Which are the matching methods available in OpenCV?
Good question. OpenCV implements Template matching in the function @ref cv::matchTemplate . The
Good question. OpenCV implements template matching in the function **matchTemplate()**. The
6 available methods are:
-# **method=CV_TM_SQDIFF**
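As a reminder of what this metric computes (lower values mean better matches), the
squared-difference formula is:

\f[R(x,y)= \sum _{x',y'} (T(x',y')-I(x+x',y+y'))^2\f]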
@@ -117,119 +119,176 @@ Code
- **What does this program do?**
- Loads an input image, an image patch (*template*), and optionally a mask
- Perform a template matching procedure by using the OpenCV function @ref cv::matchTemplate
- Perform a template matching procedure by using the OpenCV function **matchTemplate()**
with any of the 6 matching methods described before. The user can choose the method by
entering its selection in the Trackbar. If a mask is supplied, it will only be used for
the methods that support masking
- Normalize the output of the matching procedure
- Localize the location with the highest matching probability
- Draw a rectangle around the area corresponding to the highest match
@add_toggle_cpp
- **Downloadable code**: Click
[here](https://github.com/opencv/opencv/tree/master/samples/cpp/tutorial_code/Histograms_Matching/MatchTemplate_Demo.cpp)
- **Code at a glance:**
@include samples/cpp/tutorial_code/Histograms_Matching/MatchTemplate_Demo.cpp
@end_toggle
@add_toggle_java
- **Downloadable code**: Click
[here](https://github.com/opencv/opencv/tree/master/samples/java/tutorial_code/ImgProc/tutorial_template_matching/MatchTemplateDemo.java)
- **Code at a glance:**
@include samples/java/tutorial_code/ImgProc/tutorial_template_matching/MatchTemplateDemo.java
@end_toggle
@add_toggle_python
- **Downloadable code**: Click
[here](https://github.com/opencv/opencv/tree/master/samples/python/tutorial_code/imgProc/match_template/match_template.py)
- **Code at a glance:**
@include samples/python/tutorial_code/imgProc/match_template/match_template.py
@end_toggle
Explanation
-----------
-# Declare some global variables, such as the image, template and result matrices, as well as the
- Declare some global variables, such as the image, template and result matrices, as well as the
match method and the window names:
@code{.cpp}
Mat img; Mat templ; Mat result;
char* image_window = "Source Image";
char* result_window = "Result window";
int match_method;
int max_Trackbar = 5;
@endcode
-# Load the source image, template, and optionally, if supported for the matching method, a mask:
@code{.cpp}
bool method_accepts_mask = (CV_TM_SQDIFF == match_method || match_method == CV_TM_CCORR_NORMED);
if (use_mask && method_accepts_mask)
{ matchTemplate( img, templ, result, match_method, mask); }
else
{ matchTemplate( img, templ, result, match_method); }
@add_toggle_cpp
@snippet samples/cpp/tutorial_code/Histograms_Matching/MatchTemplate_Demo.cpp declare
@end_toggle
@endcode
-# Create the windows to show the results:
@code{.cpp}
namedWindow( image_window, WINDOW_AUTOSIZE );
namedWindow( result_window, WINDOW_AUTOSIZE );
@endcode
-# Create the Trackbar to enter the kind of matching method to be used. When a change is detected
the callback function **MatchingMethod** is called.
@code{.cpp}
char* trackbar_label = "Method: \n 0: SQDIFF \n 1: SQDIFF NORMED \n 2: TM CCORR \n 3: TM CCORR NORMED \n 4: TM COEFF \n 5: TM COEFF NORMED";
createTrackbar( trackbar_label, image_window, &match_method, max_Trackbar, MatchingMethod );
@endcode
-# Wait until user exits the program.
@code{.cpp}
waitKey(0);
return 0;
@endcode
-# Let's check out the callback function. First, it makes a copy of the source image:
@code{.cpp}
Mat img_display;
img.copyTo( img_display );
@endcode
-# Next, it creates the result matrix that will store the matching results for each template
location. Observe in detail the size of the result matrix (which matches all possible locations
for it)
@code{.cpp}
int result_cols = img.cols - templ.cols + 1;
int result_rows = img.rows - templ.rows + 1;
@add_toggle_java
@snippet samples/java/tutorial_code/ImgProc/tutorial_template_matching/MatchTemplateDemo.java declare
@end_toggle
result.create( result_rows, result_cols, CV_32FC1 );
@endcode
-# Perform the template matching operation:
@code{.cpp}
bool method_accepts_mask = (CV_TM_SQDIFF == match_method || match_method == CV_TM_CCORR_NORMED);
if (use_mask && method_accepts_mask)
{ matchTemplate( img, templ, result, match_method, mask); }
else
{ matchTemplate( img, templ, result, match_method); }
@endcode
the arguments are naturally the input image **I**, the template **T**, the result **R**, the
match_method (given by the Trackbar), and optionally the mask image **M**
@add_toggle_python
@snippet samples/python/tutorial_code/imgProc/match_template/match_template.py global_variables
@end_toggle
-# We normalize the results:
@code{.cpp}
normalize( result, result, 0, 1, NORM_MINMAX, -1, Mat() );
@endcode
-# We localize the minimum and maximum values in the result matrix **R** by using @ref
cv::minMaxLoc .
@code{.cpp}
double minVal; double maxVal; Point minLoc; Point maxLoc;
Point matchLoc;
- Load the source image, template, and optionally, if supported by the matching method, a mask
(see the short Python sketch after these snippets):
minMaxLoc( result, &minVal, &maxVal, &minLoc, &maxLoc, Mat() );
@endcode
the function calls as arguments:
@add_toggle_cpp
@snippet samples/cpp/tutorial_code/Histograms_Matching/MatchTemplate_Demo.cpp load_image
@end_toggle
- **result:** The source array
- **&minVal** and **&maxVal:** Variables to save the minimum and maximum values in **result**
- **&minLoc** and **&maxLoc:** The Point locations of the minimum and maximum values in the
array.
- **Mat():** Optional mask
@add_toggle_java
@snippet samples/java/tutorial_code/ImgProc/tutorial_template_matching/MatchTemplateDemo.java load_image
@end_toggle
-# For the first two methods ( TM_SQDIFF and MT_SQDIFF_NORMED ) the best match are the lowest
@add_toggle_python
@snippet samples/python/tutorial_code/imgProc/match_template/match_template.py load_image
@end_toggle
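Condensed to plain Python (taken from the sample at the end of this page), this step is essentially:
@code{.py}
import sys
import cv2

# Condensed from the Python sample at the end of this page
use_mask = False
mask = None
img = cv2.imread(sys.argv[1], cv2.IMREAD_COLOR)      # source image I
templ = cv2.imread(sys.argv[2], cv2.IMREAD_COLOR)    # template T
if len(sys.argv) > 3:                                # optional mask M
    use_mask = True
    mask = cv2.imread(sys.argv[3], cv2.IMREAD_COLOR)
if img is None or templ is None or (use_mask and mask is None):
    print('Can\'t read one of the images')
    sys.exit(-1)
@endcode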
- Create the Trackbar to enter the kind of matching method to be used. When a change is detected,
the callback function is called (a condensed Python version follows the snippets below).
@add_toggle_cpp
@snippet samples/cpp/tutorial_code/Histograms_Matching/MatchTemplate_Demo.cpp create_trackbar
@end_toggle
@add_toggle_java
@snippet samples/java/tutorial_code/ImgProc/tutorial_template_matching/MatchTemplateDemo.java create_trackbar
@end_toggle
@add_toggle_python
@snippet samples/python/tutorial_code/imgProc/match_template/match_template.py create_trackbar
@end_toggle
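A condensed Python version of this step, mirroring the sample at the end of this page (the
callback body is stubbed out here):
@code{.py}
import cv2

image_window = 'Source Image'
match_method = 0
max_Trackbar = 5

def MatchingMethod(param):
    pass  # the real callback re-runs the matching with the newly selected method

cv2.namedWindow(image_window, cv2.WINDOW_AUTOSIZE)
trackbar_label = 'Method: \n 0: SQDIFF \n 1: SQDIFF NORMED \n 2: TM CCORR \n 3: TM CCORR NORMED \n 4: TM COEFF \n 5: TM COEFF NORMED'
cv2.createTrackbar(trackbar_label, image_window, match_method, max_Trackbar, MatchingMethod)
@endcode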
- Let's check out the callback function. First, it makes a copy of the source image:
@add_toggle_cpp
@snippet samples/cpp/tutorial_code/Histograms_Matching/MatchTemplate_Demo.cpp copy_source
@end_toggle
@add_toggle_java
@snippet samples/java/tutorial_code/ImgProc/tutorial_template_matching/MatchTemplateDemo.java copy_source
@end_toggle
@add_toggle_python
@snippet samples/python/tutorial_code/imgProc/match_template/match_template.py copy_source
@end_toggle
- Perform the template matching operation. The arguments are naturally the input image **I**,
the template **T**, the result **R** and the match_method (given by the Trackbar),
and optionally the mask image **M**; see the condensed Python sketch after the snippets below.
@add_toggle_cpp
@snippet samples/cpp/tutorial_code/Histograms_Matching/MatchTemplate_Demo.cpp match_template
@end_toggle
@add_toggle_java
@snippet samples/java/tutorial_code/ImgProc/tutorial_template_matching/MatchTemplateDemo.java match_template
@end_toggle
@add_toggle_python
@snippet samples/python/tutorial_code/imgProc/match_template/match_template.py match_template
@end_toggle
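In Python, for example, the call looks like this (a condensed sketch; synthetic arrays stand in
for the loaded images, and only TM_SQDIFF and TM_CCORR_NORMED accept the mask argument):
@code{.py}
import cv2
import numpy as np

img = np.zeros((100, 100, 3), np.uint8)      # stands in for the loaded source image I
templ = np.zeros((20, 20, 3), np.uint8)      # stands in for the template T
mask = np.full((20, 20, 3), 255, np.uint8)   # stands in for the mask M
use_mask = True
match_method = cv2.TM_SQDIFF

method_accepts_mask = match_method in (cv2.TM_SQDIFF, cv2.TM_CCORR_NORMED)
if use_mask and method_accepts_mask:
    result = cv2.matchTemplate(img, templ, match_method, None, mask)
else:
    result = cv2.matchTemplate(img, templ, match_method)
@endcode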
- We normalize the results:
@add_toggle_cpp
@snippet samples/cpp/tutorial_code/Histograms_Matching/MatchTemplate_Demo.cpp normalize
@end_toggle
@add_toggle_java
@snippet samples/java/tutorial_code/ImgProc/tutorial_template_matching/MatchTemplateDemo.java normalize
@end_toggle
@add_toggle_python
@snippet samples/python/tutorial_code/imgProc/match_template/match_template.py normalize
@end_toggle
- We localize the minimum and maximum values in the result matrix **R** by using **minMaxLoc()**.
@add_toggle_cpp
@snippet samples/cpp/tutorial_code/Histograms_Matching/MatchTemplate_Demo.cpp best_match
@end_toggle
@add_toggle_java
@snippet samples/java/tutorial_code/ImgProc/tutorial_template_matching/MatchTemplateDemo.java best_match
@end_toggle
@add_toggle_python
@snippet samples/python/tutorial_code/imgProc/match_template/match_template.py best_match
@end_toggle
- For the first two methods ( TM_SQDIFF and TM_SQDIFF_NORMED ) the best matches are the lowest
values. For all the others, higher values represent better matches. So, we save the
corresponding value in the **matchLoc** variable (both steps are sketched in Python after the
snippets below):
@code{.cpp}
if( match_method == TM_SQDIFF || match_method == TM_SQDIFF_NORMED )
{ matchLoc = minLoc; }
else
{ matchLoc = maxLoc; }
@endcode
-# Display the source image and the result matrix. Draw a rectangle around the highest possible
matching area:
@code{.cpp}
rectangle( img_display, matchLoc, Point( matchLoc.x + templ.cols , matchLoc.y + templ.rows ), Scalar::all(0), 2, 8, 0 );
rectangle( result, matchLoc, Point( matchLoc.x + templ.cols , matchLoc.y + templ.rows ), Scalar::all(0), 2, 8, 0 );
imshow( image_window, img_display );
imshow( result_window, result );
@endcode
@add_toggle_cpp
@snippet samples/cpp/tutorial_code/Histograms_Matching/MatchTemplate_Demo.cpp match_loc
@end_toggle
@add_toggle_java
@snippet samples/java/tutorial_code/ImgProc/tutorial_template_matching/MatchTemplateDemo.java match_loc
@end_toggle
@add_toggle_python
@snippet samples/python/tutorial_code/imgProc/match_template/match_template.py match_loc
@end_toggle
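Condensed to plain Python, these two steps (locating the extrema and picking the right one) are:
@code{.py}
import cv2
import numpy as np

result = np.random.rand(50, 50).astype(np.float32)   # stands in for the normalized R matrix
match_method = cv2.TM_SQDIFF

minVal, maxVal, minLoc, maxLoc = cv2.minMaxLoc(result)

# For SQDIFF-based methods the best match is the minimum; otherwise it is the maximum
if match_method in (cv2.TM_SQDIFF, cv2.TM_SQDIFF_NORMED):
    matchLoc = minLoc
else:
    matchLoc = maxLoc
@endcode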
- Display the source image and the result matrix. Draw a rectangle around the highest possible
matching area (see the Python sketch after these snippets):
@add_toggle_cpp
@snippet samples/cpp/tutorial_code/Histograms_Matching/MatchTemplate_Demo.cpp imshow
@end_toggle
@add_toggle_java
@snippet samples/java/tutorial_code/ImgProc/tutorial_template_matching/MatchTemplateDemo.java imshow
@end_toggle
@add_toggle_python
@snippet samples/python/tutorial_code/imgProc/match_template/match_template.py imshow
@end_toggle
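A condensed Python version of the display step; note that the rectangle's width comes from the
template's column count and its height from the row count:
@code{.py}
import cv2
import numpy as np

img_display = np.zeros((100, 100, 3), np.uint8)      # stands in for the copied source image
result = np.random.rand(81, 81).astype(np.float32)   # stands in for the normalized R matrix
templ = np.zeros((20, 20, 3), np.uint8)              # stands in for the template
matchLoc = (30, 40)                                   # (x, y) of the best match

h, w = templ.shape[:2]                                # rows = height, cols = width
cv2.rectangle(img_display, matchLoc, (matchLoc[0] + w, matchLoc[1] + h), (0, 0, 0), 2, 8, 0)
cv2.rectangle(result, matchLoc, (matchLoc[0] + w, matchLoc[1] + h), (0, 0, 0), 2, 8, 0)
cv2.imshow('Source Image', img_display)
cv2.imshow('Result window', result)
cv2.waitKey(0)
@endcode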
Results
-------


@@ -173,6 +173,8 @@ In this section you will learn about the image processing (manipulation) functio
- @subpage tutorial_template_matching
*Languages:* C++, Java, Python
*Compatibility:* \> OpenCV 2.0
*Author:* Ana Huamán


@@ -12,6 +12,7 @@
using namespace std;
using namespace cv;
//! [declare]
/// Global Variables
bool use_mask;
Mat img; Mat templ; Mat mask; Mat result;
@@ -20,6 +21,7 @@ const char* result_window = "Result window";
int match_method;
int max_Trackbar = 5;
//! [declare]
/// Function Headers
void MatchingMethod( int, void* );
@@ -36,6 +38,7 @@ int main( int argc, char** argv )
return -1;
}
//! [load_image]
/// Load image and template
img = imread( argv[1], IMREAD_COLOR );
templ = imread( argv[2], IMREAD_COLOR );
@@ -50,19 +53,26 @@ int main( int argc, char** argv )
cout << "Can't read one of the images" << endl;
return -1;
}
//! [load_image]
//! [create_windows]
/// Create windows
namedWindow( image_window, WINDOW_AUTOSIZE );
namedWindow( result_window, WINDOW_AUTOSIZE );
//! [create_windows]
//! [create_trackbar]
/// Create Trackbar
const char* trackbar_label = "Method: \n 0: SQDIFF \n 1: SQDIFF NORMED \n 2: TM CCORR \n 3: TM CCORR NORMED \n 4: TM COEFF \n 5: TM COEFF NORMED";
createTrackbar( trackbar_label, image_window, &match_method, max_Trackbar, MatchingMethod );
//! [create_trackbar]
MatchingMethod( 0, 0 );
//! [wait_key]
waitKey(0);
return 0;
//! [wait_key]
}
/**
@@ -71,44 +81,57 @@ int main( int argc, char** argv )
*/
void MatchingMethod( int, void* )
{
//! [copy_source]
/// Source image to display
Mat img_display;
img.copyTo( img_display );
//! [copy_source]
//! [create_result_matrix]
/// Create the result matrix
int result_cols = img.cols - templ.cols + 1;
int result_rows = img.rows - templ.rows + 1;
result.create( result_rows, result_cols, CV_32FC1 );
//! [create_result_matrix]
//! [match_template]
/// Do the Matching and Normalize
bool method_accepts_mask = (CV_TM_SQDIFF == match_method || match_method == CV_TM_CCORR_NORMED);
if (use_mask && method_accepts_mask)
{ matchTemplate( img, templ, result, match_method, mask); }
else
{ matchTemplate( img, templ, result, match_method); }
//! [match_template]
//! [normalize]
normalize( result, result, 0, 1, NORM_MINMAX, -1, Mat() );
//! [normalize]
//! [best_match]
/// Localizing the best match with minMaxLoc
double minVal; double maxVal; Point minLoc; Point maxLoc;
Point matchLoc;
minMaxLoc( result, &minVal, &maxVal, &minLoc, &maxLoc, Mat() );
//! [best_match]
//! [match_loc]
/// For SQDIFF and SQDIFF_NORMED, the best matches are lower values. For all the other methods, the higher the better
if( match_method == TM_SQDIFF || match_method == TM_SQDIFF_NORMED )
{ matchLoc = minLoc; }
else
{ matchLoc = maxLoc; }
//! [match_loc]
//! [imshow]
/// Show me what you got
rectangle( img_display, matchLoc, Point( matchLoc.x + templ.cols , matchLoc.y + templ.rows ), Scalar::all(0), 2, 8, 0 );
rectangle( result, matchLoc, Point( matchLoc.x + templ.cols , matchLoc.y + templ.rows ), Scalar::all(0), 2, 8, 0 );
imshow( image_window, img_display );
imshow( result_window, result );
//! [imshow]
return;
}


@@ -0,0 +1,196 @@
import org.opencv.core.*;
import org.opencv.core.Point;
import org.opencv.imgcodecs.Imgcodecs;
import org.opencv.imgproc.Imgproc;
import javax.swing.*;
import javax.swing.event.ChangeEvent;
import javax.swing.event.ChangeListener;
import java.awt.*;
import java.awt.image.BufferedImage;
import java.awt.image.DataBufferByte;
import java.util.*;
class MatchTemplateDemoRun implements ChangeListener{
//! [declare]
/// Global Variables
Boolean use_mask = false;
Mat img = new Mat(), templ = new Mat();
Mat mask = new Mat();
int match_method;
JLabel imgDisplay = new JLabel(), resultDisplay = new JLabel();
//! [declare]
public void run(String[] args) {
if (args.length < 2)
{
System.out.println("Not enough parameters");
System.out.println("Program arguments:\n<image_name> <template_name> [<mask_name>]");
System.exit(-1);
}
//! [load_image]
/// Load image and template
img = Imgcodecs.imread( args[0], Imgcodecs.IMREAD_COLOR );
templ = Imgcodecs.imread( args[1], Imgcodecs.IMREAD_COLOR );
//! [load_image]
if(args.length > 2) {
use_mask = true;
mask = Imgcodecs.imread( args[2], Imgcodecs.IMREAD_COLOR );
}
if(img.empty() || templ.empty() || (use_mask && mask.empty()))
{
System.out.println("Can't read one of the images");
System.exit(-1);
}
matchingMethod();
createJFrame();
}
private void matchingMethod() {
Mat result = new Mat();
//! [copy_source]
/// Source image to display
Mat img_display = new Mat();
img.copyTo( img_display );
//! [copy_source]
//! [create_result_matrix]
/// Create the result matrix
int result_cols = img.cols() - templ.cols() + 1;
int result_rows = img.rows() - templ.rows() + 1;
result.create( result_rows, result_cols, CvType.CV_32FC1 );
//! [create_result_matrix]
//! [match_template]
/// Do the Matching and Normalize
Boolean method_accepts_mask = (Imgproc.TM_SQDIFF == match_method ||
match_method == Imgproc.TM_CCORR_NORMED);
if (use_mask && method_accepts_mask)
{ Imgproc.matchTemplate( img, templ, result, match_method, mask); }
else
{ Imgproc.matchTemplate( img, templ, result, match_method); }
//! [match_template]
//! [normalize]
Core.normalize( result, result, 0, 1, Core.NORM_MINMAX, -1, new Mat() );
//! [normalize]
//! [best_match]
/// Localizing the best match with minMaxLoc
double minVal; double maxVal;
Point matchLoc;
Core.MinMaxLocResult mmr = Core.minMaxLoc( result );
//! [best_match]
//! [match_loc]
/// For SQDIFF and SQDIFF_NORMED, the best matches are lower values.
// For all the other methods, the higher the better
if( match_method == Imgproc.TM_SQDIFF || match_method == Imgproc.TM_SQDIFF_NORMED )
{ matchLoc = mmr.minLoc; }
else
{ matchLoc = mmr.maxLoc; }
//! [match_loc]
//! [imshow]
/// Show me what you got
Imgproc.rectangle(img_display, matchLoc, new Point(matchLoc.x + templ.cols(),
matchLoc.y + templ.rows()), new Scalar(0, 0, 0), 2, 8, 0);
Imgproc.rectangle(result, matchLoc, new Point(matchLoc.x + templ.cols(),
matchLoc.y + templ.rows()), new Scalar(0, 0, 0), 2, 8, 0);
Image tmpImg = toBufferedImage(img_display);
ImageIcon icon = new ImageIcon(tmpImg);
imgDisplay.setIcon(icon);
result.convertTo(result, CvType.CV_8UC1, 255.0);
tmpImg = toBufferedImage(result);
icon = new ImageIcon(tmpImg);
resultDisplay.setIcon(icon);
//! [imshow]
}
public void stateChanged(ChangeEvent e) {
JSlider source = (JSlider) e.getSource();
if (!source.getValueIsAdjusting()) {
match_method = (int)source.getValue();
matchingMethod();
}
}
public Image toBufferedImage(Mat m) {
int type = BufferedImage.TYPE_BYTE_GRAY;
if ( m.channels() > 1 ) {
type = BufferedImage.TYPE_3BYTE_BGR;
}
int bufferSize = m.channels()*m.cols()*m.rows();
byte [] b = new byte[bufferSize];
m.get(0,0,b); // get all the pixels
BufferedImage image = new BufferedImage(m.cols(),m.rows(), type);
final byte[] targetPixels = ((DataBufferByte) image.getRaster().getDataBuffer()).getData();
System.arraycopy(b, 0, targetPixels, 0, b.length);
return image;
}
private void createJFrame() {
String title = "Source image; Control; Result image";
JFrame frame = new JFrame(title);
frame.setLayout(new GridLayout(2, 2));
frame.add(imgDisplay);
//! [create_trackbar]
int min = 0, max = 5;
JSlider slider = new JSlider(JSlider.VERTICAL, min, max, match_method);
//! [create_trackbar]
slider.setPaintTicks(true);
slider.setPaintLabels(true);
// Set the spacing for the minor tick mark
slider.setMinorTickSpacing(1);
// Customizing the labels
Hashtable labelTable = new Hashtable();
labelTable.put( new Integer( 0 ), new JLabel("0 - SQDIFF") );
labelTable.put( new Integer( 1 ), new JLabel("1 - SQDIFF NORMED") );
labelTable.put( new Integer( 2 ), new JLabel("2 - TM CCORR") );
labelTable.put( new Integer( 3 ), new JLabel("3 - TM CCORR NORMED") );
labelTable.put( new Integer( 4 ), new JLabel("4 - TM COEFF") );
labelTable.put( new Integer( 5 ), new JLabel("5 - TM COEFF NORMED : (Method)") );
slider.setLabelTable( labelTable );
slider.addChangeListener(this);
frame.add(slider);
frame.add(resultDisplay);
frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
frame.pack();
frame.setVisible(true);
}
}
public class MatchTemplateDemo
{
public static void main(String[] args) {
// load the native OpenCV library
System.loadLibrary(Core.NATIVE_LIBRARY_NAME);
// run code
new MatchTemplateDemoRun().run(args);
}
}


@@ -0,0 +1,96 @@
import sys
import cv2
## [global_variables]
use_mask = False
img = None
templ = None
mask = None
image_window = "Source Image"
result_window = "Result window"
match_method = 0
max_Trackbar = 5
## [global_variables]
def main(argv):

    if (len(sys.argv) < 3):
        print('Not enough parameters')
        print('Usage:\nmatch_template_demo.py <image_name> <template_name> [<mask_name>]')
        return -1

    ## [load_image]
    global img
    global templ
    img = cv2.imread(sys.argv[1], cv2.IMREAD_COLOR)
    templ = cv2.imread(sys.argv[2], cv2.IMREAD_COLOR)

    if (len(sys.argv) > 3):
        global use_mask
        use_mask = True
        global mask
        mask = cv2.imread( sys.argv[3], cv2.IMREAD_COLOR )

    if ((img is None) or (templ is None) or (use_mask and (mask is None))):
        print('Can\'t read one of the images')
        return -1
    ## [load_image]

    ## [create_windows]
    cv2.namedWindow( image_window, cv2.WINDOW_AUTOSIZE )
    cv2.namedWindow( result_window, cv2.WINDOW_AUTOSIZE )
    ## [create_windows]

    ## [create_trackbar]
    trackbar_label = 'Method: \n 0: SQDIFF \n 1: SQDIFF NORMED \n 2: TM CCORR \n 3: TM CCORR NORMED \n 4: TM COEFF \n 5: TM COEFF NORMED'
    cv2.createTrackbar( trackbar_label, image_window, match_method, max_Trackbar, MatchingMethod )
    ## [create_trackbar]

    MatchingMethod(match_method)

    ## [wait_key]
    cv2.waitKey(0)
    return 0
    ## [wait_key]

def MatchingMethod(param):

    global match_method
    match_method = param

    ## [copy_source]
    img_display = img.copy()
    ## [copy_source]

    ## [match_template]
    method_accepts_mask = (cv2.TM_SQDIFF == match_method or match_method == cv2.TM_CCORR_NORMED)
    if (use_mask and method_accepts_mask):
        result = cv2.matchTemplate(img, templ, match_method, None, mask)
    else:
        result = cv2.matchTemplate(img, templ, match_method)
    ## [match_template]

    ## [normalize]
    cv2.normalize( result, result, 0, 1, cv2.NORM_MINMAX, -1 )
    ## [normalize]

    ## [best_match]
    minVal, maxVal, minLoc, maxLoc = cv2.minMaxLoc(result, None)
    ## [best_match]

    ## [match_loc]
    if (match_method == cv2.TM_SQDIFF or match_method == cv2.TM_SQDIFF_NORMED):
        matchLoc = minLoc
    else:
        matchLoc = maxLoc
    ## [match_loc]

    ## [imshow]
    # templ.shape is (rows, cols, channels): offset x by the width (cols) and y by the height (rows)
    cv2.rectangle(img_display, matchLoc, (matchLoc[0] + templ.shape[1], matchLoc[1] + templ.shape[0]), (0,0,0), 2, 8, 0 )
    cv2.rectangle(result, matchLoc, (matchLoc[0] + templ.shape[1], matchLoc[1] + templ.shape[0]), (0,0,0), 2, 8, 0 )
    cv2.imshow(image_window, img_display)
    cv2.imshow(result_window, result)
    ## [imshow]
    pass

if __name__ == "__main__":
    main(sys.argv[1:])