org.opencv.ml
public class EM extends Algorithm
The class implements the EM algorithm as described in the beginning of this section. It inherits from "Algorithm".
Modifier and Type  Field and Description 

static int 
COV_MAT_DEFAULT 
static int 
COV_MAT_DIAGONAL 
static int 
COV_MAT_GENERIC 
static int 
COV_MAT_SPHERICAL 
static int 
DEFAULT_MAX_ITERS 
static int 
DEFAULT_NCLUSTERS 
static int 
START_AUTO_STEP 
static int 
START_E_STEP 
static int 
START_M_STEP 
Constructor and Description 

EM()
The constructor of the class

EM(int nclusters,
int covMatType,
TermCriteria termCrit)
The constructor of the class

Modifier and Type  Method and Description 

void 
clear() 
boolean 
isTrained() 
double[] 
predict(Mat sample)
Returns a likelihood logarithm value and an index of the most probable
mixture component for the given sample.

double[] 
predict(Mat sample,
Mat probs)
Returns a likelihood logarithm value and an index of the most probable
mixture component for the given sample.

boolean 
train(Mat samples)
Estimates the Gaussian mixture parameters from a sample set.

boolean 
train(Mat samples,
Mat logLikelihoods,
Mat labels,
Mat probs)
Estimates the Gaussian mixture parameters from a sample set.

boolean 
trainE(Mat samples,
Mat means0) 
boolean 
trainE(Mat samples,
Mat means0,
Mat covs0,
Mat weights0,
Mat logLikelihoods,
Mat labels,
Mat probs) 
boolean 
trainM(Mat samples,
Mat probs0) 
boolean 
trainM(Mat samples,
Mat probs0,
Mat logLikelihoods,
Mat labels,
Mat probs) 
public static final int COV_MAT_DEFAULT
public static final int COV_MAT_DIAGONAL
public static final int COV_MAT_GENERIC
public static final int COV_MAT_SPHERICAL
public static final int DEFAULT_MAX_ITERS
public static final int DEFAULT_NCLUSTERS
public static final int START_AUTO_STEP
public static final int START_E_STEP
public static final int START_M_STEP
public EM()
The constructor of the class
public EM(int nclusters, int covMatType, TermCriteria termCrit)
The constructor of the class
nclusters
 The number of mixture components in the Gaussian mixture
model. The default value of the parameter is EM.DEFAULT_NCLUSTERS=5.
Some EM implementations can determine the optimal number of mixtures
within a specified value range, but that is not the case in ML yet.

covMatType
 Constraint on the covariance matrices, which defines the type of the
matrices. The default is covMatType=EM.COV_MAT_DIAGONAL: a diagonal
matrix with positive diagonal elements, so the number of free parameters
is d for each matrix. This is the most commonly used option, yielding
good estimation results.
termCrit
 The termination criteria of the EM algorithm. The EM algorithm can be
terminated by the number of iterations termCrit.maxCount (the number of
M-steps) or when the relative change of the likelihood logarithm is less
than termCrit.epsilon. The default maximum number of iterations is
EM.DEFAULT_MAX_ITERS=100.

public void clear()
public boolean isTrained()
public double[] predict(Mat sample)
Returns a likelihood logarithm value and an index of the most probable mixture component for the given sample.
The method returns a two-element double vector. The zeroth element is
the likelihood logarithm value for the sample. The first element is the
index of the most probable mixture component for the given sample.

sample
 A sample for classification. It should be a one-channel matrix of
1 x dims or dims x 1 size.

public double[] predict(Mat sample, Mat probs)
Returns a likelihood logarithm value and an index of the most probable mixture component for the given sample.
The method returns a two-element double vector. The zeroth element is
the likelihood logarithm value for the sample. The first element is the
index of the most probable mixture component for the given sample.
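The two-element return format above can be illustrated with a small, dependency-free sketch. This is not the OpenCV implementation (the class works on Mat samples of arbitrary dimensionality); it simply shows, for a 1-D Gaussian mixture with known parameters, how a {likelihood logarithm, most probable component index} pair and the per-component posteriors (the probs output) are formed. All names here are illustrative, not OpenCV API.

```java
// Illustrative sketch: build predict()'s two-element result for a 1-D
// Gaussian mixture given its weights, means, and variances.
public class PredictSketch {
    // Density of a 1-D Gaussian N(mu, var) at x.
    static double gauss(double x, double mu, double var) {
        return Math.exp(-(x - mu) * (x - mu) / (2 * var))
                / Math.sqrt(2 * Math.PI * var);
    }

    /** Returns {logLikelihood, bestComponentIndex}; fills probs with posteriors. */
    public static double[] predict(double x, double[] w, double[] mu,
                                   double[] var, double[] probs) {
        double total = 0;
        for (int k = 0; k < w.length; k++) {
            probs[k] = w[k] * gauss(x, mu[k], var[k]);  // joint p(x, k)
            total += probs[k];
        }
        int best = 0;
        for (int k = 0; k < w.length; k++) {
            probs[k] /= total;                          // posterior p(k | x)
            if (probs[k] > probs[best]) best = k;
        }
        return new double[]{Math.log(total), best};
    }
}
```

For a sample near one component's mean, the posterior row concentrates on that component and its index is returned as the second element.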
sample
 A sample for classification. It should be a one-channel matrix of
1 x dims or dims x 1 size.

probs
 Optional output matrix that contains the posterior probabilities of
each component given the sample. It has 1 x nclusters size and
CV_64FC1 type.

public boolean train(Mat samples)
Estimates the Gaussian mixture parameters from a sample set.
The three versions of the training method differ in the initialization
of the Gaussian mixture model parameters and the start step.
The methods return true if the Gaussian mixture model was trained
successfully; otherwise they return false.
Unlike many of the ML models, EM is an unsupervised learning algorithm and it
does not take responses (class labels or function values) as input. Instead,
it computes the *Maximum Likelihood Estimate* of the Gaussian mixture
parameters from an input sample set, stores all the parameters inside the
structure (p_(i,k) in probs, a_k in means, S_k in covs[k], pi_k in
weights), and optionally computes the output "class label" for each
sample: labels_i = arg max_k(p_(i,k)), i=1..N (the indices of the
most probable mixture component for each sample).
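The alternating E-step (compute per-sample responsibilities p_(i,k)) and M-step (re-estimate weights, means, and covariances) that this training runs internally can be sketched in plain Java for the simplest case: a 1-D, two-component mixture. This is a minimal illustration under those assumptions, not the OpenCV implementation; the loop also mirrors the termCrit.maxCount and termCrit.epsilon termination conditions described for the constructor.

```java
// Minimal plain-Java sketch of the EM loop for a 1-D two-component
// Gaussian mixture. Illustrative only; names are not OpenCV API.
public class EmSketch {
    // Density of a 1-D Gaussian N(mu, var) at x.
    static double gauss(double x, double mu, double var) {
        return Math.exp(-(x - mu) * (x - mu) / (2 * var))
                / Math.sqrt(2 * Math.PI * var);
    }

    /** Runs EM from initial means; returns {w0, mu0, var0, w1, mu1, var1}. */
    public static double[] fit(double[] xs, double mu0, double mu1,
                               int maxIters, double epsilon) {
        int n = xs.length;
        double w0 = 0.5, w1 = 0.5, v0 = 1.0, v1 = 1.0;
        double[] r0 = new double[n];              // responsibilities p(k=0 | x_i)
        double prevLl = Double.NEGATIVE_INFINITY;
        for (int it = 0; it < maxIters; it++) {   // termCrit.maxCount analogue
            // E-step: posterior of component 0 per sample, plus log-likelihood.
            double ll = 0;
            for (int i = 0; i < n; i++) {
                double p0 = w0 * gauss(xs[i], mu0, v0);
                double p1 = w1 * gauss(xs[i], mu1, v1);
                r0[i] = p0 / (p0 + p1);
                ll += Math.log(p0 + p1);
            }
            // termCrit.epsilon analogue: stop on small relative change.
            if (Math.abs(ll - prevLl) < epsilon * Math.abs(prevLl)) break;
            prevLl = ll;
            // M-step: re-estimate weights, means, and variances.
            double s0 = 0, s1 = 0, m0 = 0, m1 = 0;
            for (int i = 0; i < n; i++) {
                s0 += r0[i]; s1 += 1 - r0[i];
                m0 += r0[i] * xs[i]; m1 += (1 - r0[i]) * xs[i];
            }
            mu0 = m0 / s0; mu1 = m1 / s1;
            double q0 = 0, q1 = 0;
            for (int i = 0; i < n; i++) {
                q0 += r0[i] * (xs[i] - mu0) * (xs[i] - mu0);
                q1 += (1 - r0[i]) * (xs[i] - mu1) * (xs[i] - mu1);
            }
            v0 = Math.max(q0 / s0, 1e-6);  // floor keeps variances positive
            v1 = Math.max(q1 / s1, 1e-6);
            w0 = s0 / n; w1 = s1 / n;
        }
        return new double[]{w0, mu0, v0, w1, mu1, v1};
    }
}
```

On well-separated data the means converge to the cluster centers and the weights to the cluster proportions, which is exactly the Maximum Likelihood Estimate described above.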
The trained model can be used further for prediction, just like any other classifier. The trained model is similar to the "CvNormalBayesClassifier".
samples
 Samples from which the Gaussian mixture model will be estimated. It
should be a one-channel matrix, each row of which is a sample. If the
matrix does not have CV_64F type, it will be converted to an inner
matrix of that type for further computation.

public boolean train(Mat samples, Mat logLikelihoods, Mat labels, Mat probs)
Estimates the Gaussian mixture parameters from a sample set.
The three versions of the training method differ in the initialization
of the Gaussian mixture model parameters and the start step.
The methods return true if the Gaussian mixture model was trained
successfully; otherwise they return false.
Unlike many of the ML models, EM is an unsupervised learning algorithm and it
does not take responses (class labels or function values) as input. Instead,
it computes the *Maximum Likelihood Estimate* of the Gaussian mixture
parameters from an input sample set, stores all the parameters inside the
structure (p_(i,k) in probs, a_k in means, S_k in covs[k], pi_k in
weights), and optionally computes the output "class label" for each
sample: labels_i = arg max_k(p_(i,k)), i=1..N (the indices of the
most probable mixture component for each sample).
The trained model can be used further for prediction, just like any other classifier. The trained model is similar to the "CvNormalBayesClassifier".
samples
 Samples from which the Gaussian mixture model will be estimated. It
should be a one-channel matrix, each row of which is a sample. If the
matrix does not have CV_64F type, it will be converted to an inner
matrix of that type for further computation.

logLikelihoods
 The optional output matrix that contains a likelihood logarithm value
for each sample. It has nsamples x 1 size and CV_64FC1 type.

labels
 The optional output "class label" for each sample:
labels_i = arg max_k(p_(i,k)), i=1..N (the indices of the most probable
mixture component for each sample). It has nsamples x 1 size and
CV_32SC1 type.

probs
 The optional output matrix that contains the posterior probabilities of
each Gaussian mixture component given each sample. It has nsamples x
nclusters size and CV_64FC1 type.

public boolean trainE(Mat samples, Mat means0, Mat covs0, Mat weights0, Mat logLikelihoods, Mat labels, Mat probs)