public class SVM extends StatModel
Modifier and Type | Field and Description |
---|---|
static int | C |
static int | C_SVC |
static int | CHI2 |
static int | COEF |
static int | CUSTOM |
static int | DEGREE |
static int | EPS_SVR |
static int | GAMMA |
static int | INTER |
static int | LINEAR |
static int | NU |
static int | NU_SVC |
static int | NU_SVR |
static int | ONE_CLASS |
static int | P |
static int | POLY |
static int | RBF |
static int | SIGMOID |
Fields inherited from class org.opencv.ml.StatModel: COMPRESSED_INPUT, PREPROCESSED_INPUT, RAW_OUTPUT, UPDATE_MODEL
Modifier | Constructor and Description |
---|---|
protected | SVM(long addr) |
Modifier and Type | Method and Description |
---|---|
static SVM | __fromPtr__(long addr) |
static SVM | create(): Creates empty model. |
protected void | finalize() |
double | getC(): SEE: setC |
Mat | getClassWeights(): SEE: setClassWeights |
double | getCoef0(): SEE: setCoef0 |
double | getDecisionFunction(int i, Mat alpha, Mat svidx): Retrieves the decision function. |
static ParamGrid | getDefaultGridPtr(int param_id): Generates a grid for SVM parameters. |
double | getDegree(): SEE: setDegree |
double | getGamma(): SEE: setGamma |
int | getKernelType(): Type of a SVM kernel. |
double | getNu(): SEE: setNu |
double | getP(): SEE: setP |
Mat | getSupportVectors(): Retrieves all the support vectors. The method returns all the support vectors as a floating-point matrix, where support vectors are stored as matrix rows. |
TermCriteria | getTermCriteria(): SEE: setTermCriteria |
int | getType(): SEE: setType |
Mat | getUncompressedSupportVectors(): Retrieves all the uncompressed support vectors of a linear SVM. The method returns all the uncompressed support vectors of a linear SVM that the compressed support vector, used for prediction, was derived from. |
static SVM | load(String filepath): Loads and creates a serialized SVM from a file. Use SVM::save to serialize and store an SVM to disk. |
void | setC(double val): SEE: getC |
void | setClassWeights(Mat val): SEE: getClassWeights |
void | setCoef0(double val): SEE: getCoef0 |
void | setDegree(double val): SEE: getDegree |
void | setGamma(double val): SEE: getGamma |
void | setKernel(int kernelType): Initialize with one of the predefined kernels. |
void | setNu(double val): SEE: getNu |
void | setP(double val): SEE: getP |
void | setTermCriteria(TermCriteria val): SEE: getTermCriteria |
void | setType(int val): SEE: getType |
boolean | trainAuto(Mat samples, int layout, Mat responses): Trains an SVM with optimal parameters. |
boolean | trainAuto(Mat samples, int layout, Mat responses, int kFold): Trains an SVM with optimal parameters. |
boolean | trainAuto(Mat samples, int layout, Mat responses, int kFold, ParamGrid Cgrid): Trains an SVM with optimal parameters. |
boolean | trainAuto(Mat samples, int layout, Mat responses, int kFold, ParamGrid Cgrid, ParamGrid gammaGrid): Trains an SVM with optimal parameters. |
boolean | trainAuto(Mat samples, int layout, Mat responses, int kFold, ParamGrid Cgrid, ParamGrid gammaGrid, ParamGrid pGrid): Trains an SVM with optimal parameters. |
boolean | trainAuto(Mat samples, int layout, Mat responses, int kFold, ParamGrid Cgrid, ParamGrid gammaGrid, ParamGrid pGrid, ParamGrid nuGrid): Trains an SVM with optimal parameters. |
boolean | trainAuto(Mat samples, int layout, Mat responses, int kFold, ParamGrid Cgrid, ParamGrid gammaGrid, ParamGrid pGrid, ParamGrid nuGrid, ParamGrid coeffGrid): Trains an SVM with optimal parameters. |
boolean | trainAuto(Mat samples, int layout, Mat responses, int kFold, ParamGrid Cgrid, ParamGrid gammaGrid, ParamGrid pGrid, ParamGrid nuGrid, ParamGrid coeffGrid, ParamGrid degreeGrid): Trains an SVM with optimal parameters. |
boolean | trainAuto(Mat samples, int layout, Mat responses, int kFold, ParamGrid Cgrid, ParamGrid gammaGrid, ParamGrid pGrid, ParamGrid nuGrid, ParamGrid coeffGrid, ParamGrid degreeGrid, boolean balanced): Trains an SVM with optimal parameters. |
Methods inherited from class org.opencv.ml.StatModel: calcError, empty, getVarCount, isClassifier, isTrained, predict, predict, predict, train, train, train
Methods inherited from class org.opencv.core.Algorithm: clear, getDefaultName, getNativeObjAddr, save
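The summary above maps directly onto a short training workflow. The sketch below is not part of the generated reference; it is a minimal, illustrative example of creating an empty model, selecting a type and kernel, training on row samples, and predicting, assuming the OpenCV native library has already been loaded.

```java
import org.opencv.core.CvType;
import org.opencv.core.Mat;
import org.opencv.core.TermCriteria;
import org.opencv.ml.Ml;
import org.opencv.ml.SVM;

public class SvmQuickStart {
    public static void main(String[] args) {
        // Assumes the OpenCV native library has already been loaded elsewhere,
        // e.g. via System.loadLibrary(Core.NATIVE_LIBRARY_NAME).

        // Four 2-D samples, one per row (CV_32F), with integer labels (CV_32S).
        Mat samples = new Mat(4, 2, CvType.CV_32F);
        samples.put(0, 0,
                1.0, 1.0,
                1.5, 1.2,
                8.0, 8.0,
                8.5, 7.5);
        Mat labels = new Mat(4, 1, CvType.CV_32S);
        labels.put(0, 0, 0, 0, 1, 1);

        SVM svm = SVM.create();                 // empty model
        svm.setType(SVM.C_SVC);                 // 2-class classification
        svm.setKernel(SVM.RBF);                 // one of the predefined kernels
        svm.setTermCriteria(new TermCriteria(
                TermCriteria.MAX_ITER + TermCriteria.EPS, 1000, 1e-6));

        // train(...) and predict(...) are inherited from StatModel.
        svm.train(samples, Ml.ROW_SAMPLE, labels);

        Mat query = new Mat(1, 2, CvType.CV_32F);
        query.put(0, 0, 7.9, 8.1);
        float predicted = svm.predict(query);   // expected to be close to label 1
        System.out.println("predicted class: " + predicted);
    }
}
```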
public static final int CUSTOM
public static final int LINEAR
public static final int POLY
public static final int RBF
public static final int SIGMOID
public static final int CHI2
public static final int INTER
public static final int C_SVC
public static final int NU_SVC
public static final int ONE_CLASS
public static final int EPS_SVR
public static final int NU_SVR
public static final int C
public static final int GAMMA
public static final int P
public static final int NU
public static final int COEF
public static final int DEGREE
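The parameter-ID constants above (C, GAMMA, P, NU, COEF, DEGREE) are the values accepted by getDefaultGridPtr. The following sketch is not taken from the reference itself; it shows one plausible way to combine those constants with the six-argument trainAuto overload documented below, assuming samples and labels matrices prepared as in the earlier quick-start sketch.

```java
import org.opencv.core.Mat;
import org.opencv.ml.Ml;
import org.opencv.ml.ParamGrid;
import org.opencv.ml.SVM;

public class SvmAutoTrainSketch {
    // samples: CV_32F feature rows, labels: CV_32S class labels,
    // as in the quick-start sketch earlier on this page.
    static SVM trainWithDefaultGrids(Mat samples, Mat labels) {
        SVM svm = SVM.create();
        svm.setType(SVM.C_SVC);
        svm.setKernel(SVM.RBF);

        // Default search grids for the C and GAMMA parameter IDs defined above.
        ParamGrid cGrid = SVM.getDefaultGridPtr(SVM.C);
        ParamGrid gammaGrid = SVM.getDefaultGridPtr(SVM.GAMMA);

        // 10-fold cross-validation over the two grids; the remaining
        // parameters keep their current values (see trainAuto below).
        svm.trainAuto(samples, Ml.ROW_SAMPLE, labels, 10, cGrid, gammaGrid);
        return svm;
    }
}
```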
public static SVM __fromPtr__(long addr)
public Mat getClassWeights()
public Mat getSupportVectors()
public Mat getUncompressedSupportVectors()
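As a hedged illustration of the getters above, the sketch below dumps the support vectors of an already trained model; the helper name and printing format are illustrative, not part of the OpenCV API.

```java
import org.opencv.core.Mat;
import org.opencv.ml.SVM;

public class SupportVectorDump {
    // svm is assumed to be an already trained model (see the sketches above).
    static void printSupportVectors(SVM svm) {
        // Each row of the returned floating-point matrix is one support vector.
        // For a linear SVM, getUncompressedSupportVectors() exposes the original
        // vectors that the compressed support vector was derived from.
        Mat sv = svm.getSupportVectors();
        System.out.println("support vectors: " + sv.rows() + " x " + sv.cols());
        for (int r = 0; r < sv.rows(); r++) {
            float[] row = new float[sv.cols()];
            sv.get(r, 0, row);
            System.out.println(java.util.Arrays.toString(row));
        }
    }
}
```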
public static ParamGrid getDefaultGridPtr(int param_id)
param_id
- SVM parameter ID; must be one of the SVM::ParamTypes. The grid is generated for the parameter with this ID.
The function generates a grid pointer for the specified parameter of the SVM algorithm. The grid may be passed to the function SVM::trainAuto.
public static SVM create()
public static SVM load(String filepath)
filepath
- path to the serialized SVM
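A small, illustrative round trip that pairs the inherited Algorithm.save with SVM.load; the file name is arbitrary and the helper is not part of the reference.

```java
import org.opencv.ml.SVM;

public class SvmPersistenceSketch {
    // svm is assumed to be a trained model; the file name is illustrative.
    static SVM saveAndReload(SVM svm) {
        // save(...) is inherited from Algorithm; SVM.load(...) recreates the
        // model from the serialized file.
        svm.save("svm_model.yml");
        return SVM.load("svm_model.yml");
    }
}
```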
public TermCriteria getTermCriteria()
public boolean trainAuto(Mat samples, int layout, Mat responses, int kFold, ParamGrid Cgrid, ParamGrid gammaGrid, ParamGrid pGrid, ParamGrid nuGrid, ParamGrid coeffGrid, ParamGrid degreeGrid, boolean balanced)
samples
- training samples
layout
- see ml::SampleTypes.
responses
- vector of responses associated with the training samples.
kFold
- cross-validation parameter. The training set is divided into kFold subsets. One subset is used to test the model, the others form the train set. So, the SVM algorithm is executed kFold times.
Cgrid
- grid for C
gammaGrid
- grid for gamma
pGrid
- grid for p
nuGrid
- grid for nu
coeffGrid
- grid for coeff
degreeGrid
- grid for degree
balanced
- if true and the problem is 2-class classification, the method creates more balanced cross-validation subsets, that is, the class proportions in the subsets are kept close to the proportions in the whole training set.
The method trains the SVM model automatically by choosing the optimal parameters C, gamma, p, nu, coef0 and degree. Parameters are considered optimal when the cross-validation estimate of the test set error is minimal.
This function only makes use of SVM::getDefaultGrid for parameter optimization and thus only offers rudimentary parameter options.
This function works for the classification (SVM::C_SVC or SVM::NU_SVC) as well as for the regression (SVM::EPS_SVR or SVM::NU_SVR) problem. If the problem is SVM::ONE_CLASS, no optimization is made and the usual SVM with the parameters currently set on the model is executed.
public boolean trainAuto(Mat samples, int layout, Mat responses, int kFold, ParamGrid Cgrid, ParamGrid gammaGrid, ParamGrid pGrid, ParamGrid nuGrid, ParamGrid coeffGrid, ParamGrid degreeGrid)
samples
- training samples
layout
- see ml::SampleTypes.
responses
- vector of responses associated with the training samples.
kFold
- cross-validation parameter. The training set is divided into kFold subsets. One subset is used to test the model, the others form the train set. So, the SVM algorithm is executed kFold times.
Cgrid
- grid for C
gammaGrid
- grid for gamma
pGrid
- grid for p
nuGrid
- grid for nu
coeffGrid
- grid for coeff
degreeGrid
- grid for degree
The method trains the SVM model automatically by choosing the optimal parameters C, gamma, p, nu, coef0 and degree. Parameters are considered optimal when the cross-validation estimate of the test set error is minimal.
This function only makes use of SVM::getDefaultGrid for parameter optimization and thus only offers rudimentary parameter options.
This function works for the classification (SVM::C_SVC or SVM::NU_SVC) as well as for the regression (SVM::EPS_SVR or SVM::NU_SVR) problem. If the problem is SVM::ONE_CLASS, no optimization is made and the usual SVM with the parameters currently set on the model is executed.
public boolean trainAuto(Mat samples, int layout, Mat responses, int kFold, ParamGrid Cgrid, ParamGrid gammaGrid, ParamGrid pGrid, ParamGrid nuGrid, ParamGrid coeffGrid)
samples
- training samples
layout
- see ml::SampleTypes.
responses
- vector of responses associated with the training samples.
kFold
- cross-validation parameter. The training set is divided into kFold subsets. One subset is used to test the model, the others form the train set. So, the SVM algorithm is executed kFold times.
Cgrid
- grid for C
gammaGrid
- grid for gamma
pGrid
- grid for p
nuGrid
- grid for nu
coeffGrid
- grid for coeff
The method trains the SVM model automatically by choosing the optimal parameters C, gamma, p, nu, coef0 and degree. Parameters are considered optimal when the cross-validation estimate of the test set error is minimal.
This function only makes use of SVM::getDefaultGrid for parameter optimization and thus only offers rudimentary parameter options.
This function works for the classification (SVM::C_SVC or SVM::NU_SVC) as well as for the regression (SVM::EPS_SVR or SVM::NU_SVR) problem. If the problem is SVM::ONE_CLASS, no optimization is made and the usual SVM with the parameters currently set on the model is executed.
public boolean trainAuto(Mat samples, int layout, Mat responses, int kFold, ParamGrid Cgrid, ParamGrid gammaGrid, ParamGrid pGrid, ParamGrid nuGrid)
samples
- training samples
layout
- see ml::SampleTypes.
responses
- vector of responses associated with the training samples.
kFold
- cross-validation parameter. The training set is divided into kFold subsets. One subset is used to test the model, the others form the train set. So, the SVM algorithm is executed kFold times.
Cgrid
- grid for C
gammaGrid
- grid for gamma
pGrid
- grid for p
nuGrid
- grid for nu
The method trains the SVM model automatically by choosing the optimal parameters C, gamma, p, nu, coef0 and degree. Parameters are considered optimal when the cross-validation estimate of the test set error is minimal.
This function only makes use of SVM::getDefaultGrid for parameter optimization and thus only offers rudimentary parameter options.
This function works for the classification (SVM::C_SVC or SVM::NU_SVC) as well as for the regression (SVM::EPS_SVR or SVM::NU_SVR) problem. If the problem is SVM::ONE_CLASS, no optimization is made and the usual SVM with the parameters currently set on the model is executed.
public boolean trainAuto(Mat samples, int layout, Mat responses, int kFold, ParamGrid Cgrid, ParamGrid gammaGrid, ParamGrid pGrid)
samples
- training samples
layout
- see ml::SampleTypes.
responses
- vector of responses associated with the training samples.
kFold
- cross-validation parameter. The training set is divided into kFold subsets. One subset is used to test the model, the others form the train set. So, the SVM algorithm is executed kFold times.
Cgrid
- grid for C
gammaGrid
- grid for gamma
pGrid
- grid for p
The method trains the SVM model automatically by choosing the optimal parameters C, gamma, p, nu, coef0 and degree. Parameters are considered optimal when the cross-validation estimate of the test set error is minimal.
This function only makes use of SVM::getDefaultGrid for parameter optimization and thus only offers rudimentary parameter options.
This function works for the classification (SVM::C_SVC or SVM::NU_SVC) as well as for the regression (SVM::EPS_SVR or SVM::NU_SVR) problem. If the problem is SVM::ONE_CLASS, no optimization is made and the usual SVM with the parameters currently set on the model is executed.
public boolean trainAuto(Mat samples, int layout, Mat responses, int kFold, ParamGrid Cgrid, ParamGrid gammaGrid)
samples
- training samples
layout
- see ml::SampleTypes.
responses
- vector of responses associated with the training samples.
kFold
- cross-validation parameter. The training set is divided into kFold subsets. One subset is used to test the model, the others form the train set. So, the SVM algorithm is executed kFold times.
Cgrid
- grid for C
gammaGrid
- grid for gamma
The method trains the SVM model automatically by choosing the optimal parameters C, gamma, p, nu, coef0 and degree. Parameters are considered optimal when the cross-validation estimate of the test set error is minimal.
This function only makes use of SVM::getDefaultGrid for parameter optimization and thus only offers rudimentary parameter options.
This function works for the classification (SVM::C_SVC or SVM::NU_SVC) as well as for the regression (SVM::EPS_SVR or SVM::NU_SVR) problem. If the problem is SVM::ONE_CLASS, no optimization is made and the usual SVM with the parameters currently set on the model is executed.
public boolean trainAuto(Mat samples, int layout, Mat responses, int kFold, ParamGrid Cgrid)
samples
- training samples
layout
- see ml::SampleTypes.
responses
- vector of responses associated with the training samples.
kFold
- cross-validation parameter. The training set is divided into kFold subsets. One subset is used to test the model, the others form the train set. So, the SVM algorithm is executed kFold times.
Cgrid
- grid for C
The method trains the SVM model automatically by choosing the optimal parameters C, gamma, p, nu, coef0 and degree. Parameters are considered optimal when the cross-validation estimate of the test set error is minimal.
This function only makes use of SVM::getDefaultGrid for parameter optimization and thus only offers rudimentary parameter options.
This function works for the classification (SVM::C_SVC or SVM::NU_SVC) as well as for the regression (SVM::EPS_SVR or SVM::NU_SVR) problem. If the problem is SVM::ONE_CLASS, no optimization is made and the usual SVM with the parameters currently set on the model is executed.
public boolean trainAuto(Mat samples, int layout, Mat responses, int kFold)
samples
- training samples
layout
- see ml::SampleTypes.
responses
- vector of responses associated with the training samples.
kFold
- cross-validation parameter. The training set is divided into kFold subsets. One subset is used to test the model, the others form the train set. So, the SVM algorithm is executed kFold times.
The method trains the SVM model automatically by choosing the optimal parameters C, gamma, p, nu, coef0 and degree. Parameters are considered optimal when the cross-validation estimate of the test set error is minimal.
This function only makes use of SVM::getDefaultGrid for parameter optimization and thus only offers rudimentary parameter options.
This function works for the classification (SVM::C_SVC or SVM::NU_SVC) as well as for the regression (SVM::EPS_SVR or SVM::NU_SVR) problem. If the problem is SVM::ONE_CLASS, no optimization is made and the usual SVM with the parameters currently set on the model is executed.
public boolean trainAuto(Mat samples, int layout, Mat responses)
samples
- training samples
layout
- see ml::SampleTypes.
responses
- vector of responses associated with the training samples.
The method trains the SVM model automatically by choosing the optimal parameters C, gamma, p, nu, coef0 and degree. Parameters are considered optimal when the cross-validation estimate of the test set error is minimal.
This function only makes use of SVM::getDefaultGrid for parameter optimization and thus only offers rudimentary parameter options.
This function works for the classification (SVM::C_SVC or SVM::NU_SVC) as well as for the regression (SVM::EPS_SVR or SVM::NU_SVR) problem. If the problem is SVM::ONE_CLASS, no optimization is made and the usual SVM with the parameters currently set on the model is executed.
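As an illustration of the shortest overload above, the following sketch (not from the reference) trains with the default cross-validation settings and then reads back the parameter values the search selected via the getters documented below.

```java
import org.opencv.core.Mat;
import org.opencv.ml.Ml;
import org.opencv.ml.SVM;

public class TrainAutoDefaults {
    // samples/labels as in the quick-start sketch. The shortest trainAuto
    // overload falls back to the default kFold and default parameter grids.
    static void autoTrainAndReport(Mat samples, Mat labels) {
        SVM svm = SVM.create();
        svm.setType(SVM.C_SVC);
        svm.setKernel(SVM.RBF);
        svm.trainAuto(samples, Ml.ROW_SAMPLE, labels);
        // The chosen optimal parameters can be inspected afterwards.
        System.out.println("chosen C = " + svm.getC()
                + ", gamma = " + svm.getGamma());
    }
}
```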
public double getC()
public double getCoef0()
public double getDecisionFunction(int i, Mat alpha, Mat svidx)
i
- the index of the decision function. If the problem solved is regression, 1-class or 2-class classification, then there will be just one decision function and the index should always be 0. Otherwise, in the case of N-class classification, there will be \(N(N-1)/2\) decision functions.
alpha
- the optional output vector for weights, corresponding to different support vectors. In the case of linear SVM all the alpha's will be 1's.
svidx
- the optional output vector of indices of support vectors within the matrix of support vectors (which can be retrieved by SVM::getSupportVectors). In the case of linear SVM each decision function consists of a single "compressed" support vector.
The method returns the rho parameter of the decision function, a scalar subtracted from the weighted sum of kernel responses.
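A minimal sketch, not part of the reference, showing how the returned rho and the two output matrices fit together for a model with a single decision function (regression, one-class, or 2-class classification).

```java
import org.opencv.core.Mat;
import org.opencv.ml.SVM;

public class DecisionFunctionSketch {
    // svm is assumed to be a trained 2-class (or regression/one-class) model,
    // so index 0 is the only decision function.
    static void dumpDecisionFunction(SVM svm) {
        Mat alpha = new Mat();   // weights for the support vectors
        Mat svidx = new Mat();   // indices into getSupportVectors()
        double rho = svm.getDecisionFunction(0, alpha, svidx);
        System.out.println("rho = " + rho
                + ", #weights = " + alpha.total()
                + ", #support vector indices = " + svidx.total());
    }
}
```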
public double getDegree()
public double getGamma()
public double getNu()
public double getP()
public int getKernelType()
public int getType()
public void setC(double val)
val
- automatically generated
public void setClassWeights(Mat val)
val
- automatically generated
public void setCoef0(double val)
val
- automatically generated
public void setDegree(double val)
val
- automatically generated
public void setGamma(double val)
val
- automatically generated
public void setKernel(int kernelType)
kernelType
- automatically generated
public void setNu(double val)
val
- automatically generated
public void setP(double val)
val
- automatically generated
public void setTermCriteria(TermCriteria val)
val
- automatically generated
public void setType(int val)
val
- automatically generated
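Finally, an illustrative use of setClassWeights for an imbalanced 2-class C_SVC problem. The weight values are arbitrary, and the note that the weights rescale C per class follows the C++ SVM documentation rather than this page.

```java
import org.opencv.core.CvType;
import org.opencv.core.Mat;
import org.opencv.ml.SVM;

public class ClassWeightsSketch {
    // Illustrative only: weight class 1 five times more heavily than class 0.
    // Per the C++ documentation, these per-class weights rescale C for C_SVC.
    static void configure(SVM svm) {
        Mat weights = new Mat(2, 1, CvType.CV_32F);
        weights.put(0, 0, 1.0, 5.0);
        svm.setClassWeights(weights);
    }
}
```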