This is a simple classification model assuming that feature vectors from each class are normally distributed (though not necessarily independently distributed). So, the whole data distribution function is assumed to be a Gaussian mixture, one component per class. Using the training data, the algorithm estimates a mean vector and a covariance matrix for every class, and then uses them for prediction.
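As a sketch of the model (the standard Bayes decision rule with Gaussian class-conditional densities; the notation below is ours, not taken from the implementation), each class :math:`k` is described by a mean vector :math:`\mu_k` and a covariance matrix :math:`\Sigma_k`, and a :math:`d`-dimensional sample :math:`x` is assigned to the class with the largest posterior:

.. math::

    p(x \mid k) = \frac{1}{(2\pi)^{d/2} |\Sigma_k|^{1/2}} \exp\!\left( -\frac{1}{2} (x - \mu_k)^\top \Sigma_k^{-1} (x - \mu_k) \right), \qquad
    \hat{k} = \arg\max_k \; P(k)\, p(x \mid k)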
The method trains the Normal Bayes classifier. It follows the conventions of the generic ``train`` method with the following limitations:
* Only ``CV_ROW_SAMPLE`` data layout is supported.
* Input variables are all ordered.
* The output variable is categorical, which means that elements of ``_responses`` must be integer numbers, though the vector may have the ``CV_32FC1`` type.
In addition, there is an ``update`` flag that specifies whether the model should be trained from scratch (``update=false``) or updated using the new training data (``update=true``), as in the sketch below.
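A minimal training sketch, assuming the OpenCV 2.x C++ interface (the ``CvNormalBayesClassifier`` class name and ``train`` signature are taken from that API; the data is purely illustrative): ::

    #include <opencv2/core/core.hpp>
    #include <opencv2/ml/ml.hpp>

    int main()
    {
        // Each row is one training sample (CV_ROW_SAMPLE layout); all
        // input variables are ordered (numeric) and stored as CV_32FC1.
        float samplesData[] = { 1.0f, 1.1f,
                                1.2f, 0.9f,
                                8.0f, 8.1f,
                                7.9f, 8.3f };
        cv::Mat trainData(4, 2, CV_32FC1, samplesData);

        // Categorical responses: integer class labels held in a CV_32FC1 vector.
        float labelsData[] = { 0.f, 0.f, 1.f, 1.f };
        cv::Mat responses(4, 1, CV_32FC1, labelsData);

        CvNormalBayesClassifier classifier;
        classifier.train(trainData, responses);  // train from scratch (update=false)

        // Later, the model could be updated with new data instead of retrained:
        // classifier.train(newData, newResponses, cv::Mat(), cv::Mat(), true);
        return 0;
    }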
The method ``predict`` estimates the most probable class for each input vector. Input vectors (one or more) are stored as rows of the matrix ``samples``. In the case of multiple input vectors, there should be one output vector ``results``, holding one prediction per row. The predicted class for a single input vector is returned by the method.
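A short prediction sketch, again assuming the 2.x C++ interface (``predict(samples, &results)``); the helper names are illustrative: ::

    #include <opencv2/core/core.hpp>
    #include <opencv2/ml/ml.hpp>

    // Classify a batch of samples with an already trained classifier:
    // one row of `samples` per input vector, one element of `results` per row.
    cv::Mat classifyBatch(CvNormalBayesClassifier& nb, const cv::Mat& samples)
    {
        cv::Mat results;
        nb.predict(samples, &results);
        return results;
    }

    // For a single input vector the predicted class is returned directly.
    float classifyOne(CvNormalBayesClassifier& nb, const cv::Mat& sample)
    {
        return nb.predict(sample);
    }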