This simple classification model assumes that feature vectors from each class are normally distributed (though not necessarily independently distributed), so the overall data distribution is assumed to be a Gaussian mixture with one component per class. From the training data the algorithm estimates a mean vector and a covariance matrix for each class, and then uses them for prediction.
[Fukunaga90]
Bayes classifier for normally distributed data.
Creates an empty model.
Use StatModel::train to train the model, StatModel::train<NormalBayesClassifier>(traindata, params) to create and train the model in one step, or StatModel::load<NormalBayesClassifier>(filename) to load a pre-trained model.
Predicts the response for sample(s).
The method estimates the most probable class for each input vector. Input vectors (one or more) are stored as rows of the matrix inputs. When there are multiple input vectors, there should be one output vector outputs. For a single input vector, the predicted class is also returned by the method. The vector outputProbs contains the output probabilities corresponding to each element of result.