OpenCV 4.10.0
Open Source Computer Vision
Tracking API implementation details

Detailed Description

Long-term optical tracking API

Long-term optical tracking is an important issue for many computer vision applications in real-world scenarios. The development in this area is very fragmented, and this API is a unique interface that is useful for plugging in several algorithms and comparing them. This work is partially based on [231] and [163].

These algorithms start from a bounding box of the target and, with their internal representation, avoid drift during tracking. These long-term trackers are able to evaluate online the quality of the target's location in the new frame, without ground truth.

There are three main components: the TrackerContribSampler, the TrackerContribFeatureSet and the TrackerModel. The first component is the object that computes the patches over the frame based on the last target location. The TrackerContribFeatureSet is the class that manages the features; it is possible to plug in many kinds of them (HAAR, HOG, LBP, Feature2D, etc). The last component is the internal representation of the target: it is the appearance model. It stores all state candidates and computes the trajectory (the most likely target states). The class TrackerTargetState represents a possible state of the target. The TrackerContribSampler and the TrackerContribFeatureSet form the visual representation of the target, while the TrackerModel is the statistical model.
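To make the interaction between these components concrete, here is a rough per-frame sketch of how a tracker's update step could wire them together. The method names (sampling, getSamples, extraction, getResponses, modelEstimation, runStateEstimator, getLastTargetState, modelUpdate) follow the detail API as it is used by the bundled trackers, but verify them against tracking.detail.hpp before relying on this sketch; MyTracker is a hypothetical class name:

// Hypothetical per-frame update, wiring the three components together.
// 'sampler', 'featureSet' and 'model' are the fields of the Tracker described below.
bool MyTracker::updateImpl( const Mat& image, Rect2d& boundingBox )
{
    // 1. Sampling: compute patches around the last known target location
    sampler->sampling( image, Rect( boundingBox ) );
    const std::vector<Mat>& samples = sampler->getSamples();

    // 2. Features: extract the responses for every sample
    featureSet->extraction( samples );
    const std::vector<Mat>& responses = featureSet->getResponses();

    // 3. Model: score the candidate states and run the state estimator
    model->modelEstimation( responses );
    if( !model->runStateEstimator() )
        return false;

    // 4. Read back the most likely state and update the appearance model
    Ptr<TrackerTargetState> lastState = model->getLastTargetState();
    boundingBox = Rect2d( lastState->getTargetPosition().x, lastState->getTargetPosition().y,
                          lastState->getTargetWidth(), lastState->getTargetHeight() );
    model->modelUpdate();
    return true;
}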

A recent benchmark between these algorithms can be found in [301].

Creating Your Own Tracker

If you want to create a new tracker, here's what you have to do. First, decide on the name of the class for the tracker (to match the existing style, we suggest something with the prefix "tracker", e.g. trackerMIL, trackerBoosting); we shall refer to this choice as "classname" in what follows.

Every tracker has three components: TrackerContribSampler, TrackerContribFeatureSet and TrackerModel. The first two are instantiated by the Tracker base class, while the last component is abstract, so you must implement your own TrackerModel.

TrackerContribSampler

TrackerContribSampler is already instantiated, but you should define the sampling algorithm and add the classes (or a single class) to TrackerContribSampler. You can choose one of the ready-made implementations, such as TrackerContribSamplerCSC, or implement your own sampling method; in that case the class must inherit from TrackerContribSamplerAlgorithm. Fill in the samplingImpl method so that it writes the result to the "sample" output argument.

Example of creating a specialized TrackerContribSamplerAlgorithm, TrackerContribSamplerCSC:

class CV_EXPORTS_W TrackerContribSamplerCSC : public TrackerContribSamplerAlgorithm
{
public:
    TrackerContribSamplerCSC( const TrackerContribSamplerCSC::Params &parameters = TrackerContribSamplerCSC::Params() );
    ~TrackerContribSamplerCSC();
    ...
protected:
    bool samplingImpl( const Mat& image, Rect boundingBox, std::vector<Mat>& sample );
    ...
};
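For illustration only, a minimal samplingImpl body could look like the following sketch, which crops the target patch and a few shifted patches around the bounding box. This is not the actual CSC strategy (which draws positive and negative samples according to its mode and radius parameters); MySamplerAlgorithm is a hypothetical class name:

bool MySamplerAlgorithm::samplingImpl( const Mat& image, Rect boundingBox, std::vector<Mat>& sample )
{
    // Collect the target patch plus a few patches shifted by 'step' pixels.
    sample.clear();
    const int step = 5;
    for( int dy = -step; dy <= step; dy += step )
    {
        for( int dx = -step; dx <= step; dx += step )
        {
            Rect r = boundingBox + Point( dx, dy );
            r &= Rect( 0, 0, image.cols, image.rows );  // clip to the image borders
            if( r.area() > 0 )
                sample.push_back( image( r ).clone() );
        }
    }
    return !sample.empty();
}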

Example of adding a TrackerContribSamplerAlgorithm to TrackerContribSampler:

//sampler is the TrackerContribSampler
Ptr<TrackerContribSamplerAlgorithm> CSCSampler = makePtr<TrackerContribSamplerCSC>( CSCparameters );
if( !sampler->addTrackerSamplerAlgorithm( CSCSampler ) )
return false;
//or add CSC sampler with default parameters
//sampler->addTrackerSamplerAlgorithm( "CSC" );
See also
TrackerContribSamplerCSC, TrackerContribSamplerAlgorithm

TrackerContribFeatureSet

TrackerContribFeatureSet is already instantiated (as the first component), but you should define what kinds of features you'll use in your tracker. You can use multiple feature types, so you can add a ready-made implementation such as TrackerContribFeatureHAAR to your TrackerContribFeatureSet, or develop your own implementation. In the latter case, put the code that extracts the features in the computeImpl method and, optionally, the code for refining and selecting the features in the selection method.

Example of creating a specialized TrackerFeature, TrackerContribFeatureHAAR:

class CV_EXPORTS_W TrackerContribFeatureHAAR : public TrackerFeature
{
public:
    TrackerContribFeatureHAAR( const TrackerContribFeatureHAAR::Params &parameters = TrackerContribFeatureHAAR::Params() );
    ~TrackerContribFeatureHAAR();
    void selection( Mat& response, int npoints );
    ...
protected:
    bool computeImpl( const std::vector<Mat>& images, Mat& response );
    ...
};
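As an illustrative sketch only, computeImpl could reduce each patch to a single mean-intensity value; a real feature such as TrackerContribFeatureHAAR evaluates its feature pool on every patch instead. MyTrackerFeature is a hypothetical class name:

bool MyTrackerFeature::computeImpl( const std::vector<Mat>& images, Mat& response )
{
    // One column per sample; here the "feature" is just the mean intensity of the patch.
    if( images.empty() )
        return false;
    response = Mat( 1, (int)images.size(), CV_32F );
    for( size_t i = 0; i < images.size(); i++ )
        response.at<float>( 0, (int)i ) = (float)mean( images[i] )[0];
    return true;
}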

Example of adding a TrackerFeature to TrackerContribFeatureSet:

//featureSet is the TrackerContribFeatureSet
Ptr<TrackerFeature> trackerFeature = makePtr<TrackerContribFeatureHAAR>( HAARparameters );
featureSet->addTrackerFeature( trackerFeature );
See also
TrackerContribFeatureHAAR, TrackerContribFeatureSet

TrackerModel

TrackerModel is abstract, so in your implementation you must develop a class that inherits from TrackerModel. Fill in the modelEstimationImpl method, which estimates the most likely target location (see [231] table I (ME) for further information), and fill in modelUpdateImpl in order to update the model (see [231] table I (MU)). In this class you can use the ConfidenceMap and Trajectory types to store the model: the first represents the model over all possible candidate states, and the second represents the list of all estimated states.

Example of creating a specialized TrackerModel, TrackerMILModel:

class TrackerMILModel : public TrackerModel
{
public:
    TrackerMILModel( const Rect& boundingBox );
    ~TrackerMILModel();
    ...
protected:
    void modelEstimationImpl( const std::vector<Mat>& responses );
    void modelUpdateImpl();
    ...
};
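As a hedged sketch of what the two Impl methods might do: modelEstimationImpl could fill the currentConfidenceMap member inherited from TrackerModel with one (state, score) pair per candidate, and modelUpdateImpl would refresh the appearance statistics. MyTrackerModel, the response layout (a single 1 x N score matrix) and the candidateStates member are illustrative assumptions only:

void MyTrackerModel::modelEstimationImpl( const std::vector<Mat>& responses )
{
    // Pair every candidate state with its score, in the same order as the sampler.
    // candidateStates (std::vector<Ptr<TrackerTargetState> >) is an assumed member.
    currentConfidenceMap.clear();
    if( responses.empty() || candidateStates.empty() )
        return;
    for( size_t i = 0; i < candidateStates.size(); i++ )
    {
        float score = responses[0].at<float>( 0, (int)i );
        currentConfidenceMap.push_back( std::make_pair( candidateStates[i], score ) );
    }
}

void MyTrackerModel::modelUpdateImpl()
{
    // A real model would retrain/update its appearance model here
    // (e.g. the MIL classifier) using the newly confirmed target state.
}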

And add it in your Tracker:

bool TrackerMIL::initImpl( const Mat& image, const Rect2d& boundingBox )
{
    ...
    //model is the general TrackerModel field of the general Tracker
    model = makePtr<TrackerMILModel>( boundingBox );
    ...
}

In the last step you should define the TrackerStateEstimator based on your implementation, or you can use one of the ready-made classes such as TrackerStateEstimatorMILBoosting. It represents the statistical part of the model and estimates the most likely target state.

Example of creating a specialized TrackerStateEstimator, TrackerStateEstimatorMILBoosting:

class CV_EXPORTS_W TrackerStateEstimatorMILBoosting : public TrackerStateEstimator
{
    class TrackerMILTargetState : public TrackerTargetState
    {
        ...
    };
public:
    TrackerStateEstimatorMILBoosting( int nFeatures = 250 );
    ~TrackerStateEstimatorMILBoosting();
    ...
protected:
    Ptr<TrackerTargetState> estimateImpl( const std::vector<ConfidenceMap>& confidenceMaps );
    void updateImpl( std::vector<ConfidenceMap>& confidenceMaps );
    ...
};
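A minimal estimateImpl could simply return the highest-scoring state of the most recent ConfidenceMap; the real MIL boosting estimator rescores the candidates with its learned classifier first. MyStateEstimator is a hypothetical class name used only for this sketch:

Ptr<TrackerTargetState> MyStateEstimator::estimateImpl( const std::vector<ConfidenceMap>& confidenceMaps )
{
    // Pick the candidate with the highest score from the last confidence map.
    if( confidenceMaps.empty() || confidenceMaps.back().empty() )
        return Ptr<TrackerTargetState>();
    const ConfidenceMap& last = confidenceMaps.back();
    size_t best = 0;
    for( size_t i = 1; i < last.size(); i++ )
        if( last[i].second > last[best].second )
            best = i;
    return last[best].first;
}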

And add it in your TrackerModel:

//model is the TrackerModel of your Tracker
Ptr<TrackerStateEstimatorMILBoosting> stateEstimator = makePtr<TrackerStateEstimatorMILBoosting>( params.featureSetNumFeatures );
model->setTrackerStateEstimator( stateEstimator );
See also
TrackerModel, TrackerStateEstimatorMILBoosting, TrackerTargetState

During this step, you should define your TrackerTargetState based on your implementation. The TrackerTargetState base class has only the bounding box (upper-left position, width and height); you can enrich it by adding a scale factor, target rotation, etc.

Example of creating a specialized TrackerTargetState, TrackerMILTargetState:

class TrackerMILTargetState : public TrackerTargetState
{
public:
    TrackerMILTargetState( const Point2f& position, int targetWidth, int targetHeight, bool foreground, const Mat& features );
    ~TrackerMILTargetState();
    ...
private:
    bool isTarget;
    Mat targetFeatures;
    ...
};

Namespaces

namespace  cv::detail::tracking::contrib_feature
 
namespace  cv::detail::tracking::kalman_filters
 
namespace  cv::detail::tracking::online_boosting
 
namespace  cv::detail::tracking::tbm
 
namespace  cv::detail::tracking::tld
 

Classes

class  cv::detail::tracking::TrackerContribFeature
 Abstract base class for TrackerContribFeature that represents the feature. More...
 
class  cv::detail::tracking::TrackerContribFeatureHAAR
 TrackerContribFeature based on HAAR features, used by TrackerMIL and many other algorithms. More...
 
class  cv::detail::tracking::TrackerContribFeatureSet
 Class that manages the extraction and selection of features. More...
 
class  cv::detail::tracking::TrackerContribSampler
 Class that manages the sampler in order to select regions for updating the model of the tracker [AAM] Sampling and Labeling. See table I and section III B. More...
 
class  cv::detail::tracking::TrackerContribSamplerAlgorithm
 Abstract base class for TrackerContribSamplerAlgorithm that represents the algorithm for the specific sampler. More...
 
class  cv::detail::tracking::TrackerContribSamplerCSC
 TrackerSampler based on CSC (current state centered), used by the MIL algorithm TrackerMIL. More...
 
class  cv::detail::tracking::TrackerFeature
 Abstract base class for TrackerFeature that represents the feature. More...
 
class  cv::detail::tracking::TrackerFeatureFeature2d
 TrackerContribFeature based on Feature2D. More...
 
class  cv::detail::tracking::TrackerFeatureHOG
 TrackerContribFeature based on HOG. More...
 
class  cv::detail::tracking::TrackerFeatureLBP
 TrackerContribFeature based on LBP. More...
 
class  cv::detail::tracking::TrackerFeatureSet
 Class that manages the extraction and selection of features. More...
 
class  cv::detail::tracking::TrackerModel
 Abstract class that represents the model of the target. More...
 
class  cv::detail::tracking::TrackerSampler
 Class that manages the sampler in order to select regions for updating the model of the tracker [AAM] Sampling and Labeling. See table I and section III B. More...
 
class  cv::detail::tracking::TrackerSamplerAlgorithm
 Abstract base class for TrackerSamplerAlgorithm that represents the algorithm for the specific sampler. More...
 
class  cv::detail::tracking::TrackerSamplerCS
 TrackerContribSampler based on CS (current state), used by the algorithm TrackerBoosting. More...
 
class  cv::detail::tracking::TrackerSamplerCSC
 TrackerSampler based on CSC (current state centered), used by the MIL algorithm TrackerMIL. More...
 
class  cv::detail::tracking::TrackerSamplerPF
 This sampler is based on particle filtering. More...
 
class  cv::detail::tracking::TrackerStateEstimator
 Abstract base class for TrackerStateEstimator that estimates the most likely target state. More...
 
class  cv::detail::tracking::TrackerStateEstimatorAdaBoosting
 TrackerStateEstimatorAdaBoosting based on ADA-Boosting. More...
 
class  cv::detail::tracking::TrackerStateEstimatorSVM
 TrackerStateEstimator based on SVM. More...
 
class  cv::detail::tracking::TrackerTargetState
 Abstract base class for TrackerTargetState that represents a possible state of the target. More...
 

Typedefs

typedef std::vector< std::pair< Ptr< TrackerTargetState >, float > > cv::detail::tracking::ConfidenceMap
 Represents the model of the target at frame \(k\) (all states and scores)
 
typedef std::vector< Ptr< TrackerTargetState > > cv::detail::tracking::Trajectory
 Represents the estimate states for all frames.
 

Functions

void cv::detail::tracking::computeInteractionMatrix (const cv::Mat &uv, const cv::Mat &depths, const cv::Mat &K, cv::Mat &J)
 Compute the interaction matrix ( [132] [52] [53] ) for a set of 2D pixels. This is usually used in visual servoing applications to command a robot to move at desired pixel locations/velocities. By inverting this matrix, one can estimate the camera spatial velocity, i.e., the twist.
 
cv::Vec6d cv::detail::tracking::computeTwist (const cv::Mat &uv, const cv::Mat &duv, const cv::Mat &depths, const cv::Mat &K)
 Compute the camera twist from a set of 2D pixel locations, their velocities, depth values and the intrinsic parameters of the camera. The pixel velocities are usually obtained from optical flow algorithms; both dense and sparse flow can be used to compute the flow between images, and duv is computed by dividing the flow by the time interval between the images.
 

Typedef Documentation

◆ ConfidenceMap

typedef std::vector<std::pair<Ptr<TrackerTargetState>, float> > cv::detail::tracking::ConfidenceMap

#include <opencv2/video/detail/tracking.detail.hpp>

Represents the model of the target at frame \(k\) (all states and scores)

See [231] The set of the pair \(\langle \hat{x}^{i}_{k}, C^{i}_{k} \rangle\)

See also
TrackerTargetState

◆ Trajectory

typedef std::vector<Ptr<TrackerTargetState> > cv::detail::tracking::Trajectory

#include <opencv2/video/detail/tracking.detail.hpp>

Represents the estimate states for all frames.

See [231]: \(x_{k}\) is the trajectory of the target up to time \(k\).

See also
TrackerTargetState

Function Documentation

◆ computeInteractionMatrix()

void cv::detail::tracking::computeInteractionMatrix ( const cv::Mat & uv,
                                                      const cv::Mat & depths,
                                                      const cv::Mat & K,
                                                      cv::Mat & J )

#include <opencv2/tracking/twist.hpp>

Compute the interaction matrix ( [132] [52] [53] ) for a set of 2D pixels. This is usually used in visual servoing applications to command a robot to move at desired pixel locations/velocities. By inverting this matrix, one can estimate the camera spatial velocity, i.e., the twist.

Parameters
    uv      2xN matrix of 2D pixel locations
    depths  1xN matrix of depth values
    K       3x3 camera intrinsic matrix
    J       2Nx6 interaction matrix
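For reference, the point-feature interaction matrix commonly used in the visual-servoing literature cited above is, for a point with normalized image coordinates \((x, y)\) and depth \(Z\) (whether this implementation works in normalized or pixel coordinates should be checked in twist.hpp):

\[
L = \begin{pmatrix}
-\frac{1}{Z} & 0 & \frac{x}{Z} & x y & -(1 + x^{2}) & y \\
0 & -\frac{1}{Z} & \frac{y}{Z} & 1 + y^{2} & -x y & -x
\end{pmatrix}
\]

so that \(\dot{\mathbf{p}} = L \mathbf{v}\), where \(\mathbf{v} = (v_x, v_y, v_z, \omega_x, \omega_y, \omega_z)^{T}\) is the camera twist; stacking one such 2x6 block per point yields the 2Nx6 matrix J.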

◆ computeTwist()

cv::Vec6d cv::detail::tracking::computeTwist ( const cv::Mat & uv,
                                               const cv::Mat & duv,
                                               const cv::Mat & depths,
                                               const cv::Mat & K )

#include <opencv2/tracking/twist.hpp>

Compute the camera twist from a set of 2D pixel locations, their velocities, depth values and the intrinsic parameters of the camera. The pixel velocities are usually obtained from optical flow algorithms; both dense and sparse flow can be used to compute the flow between images, and duv is computed by dividing the flow by the time interval between the images.

Parameters
    uv      2xN matrix of 2D pixel locations
    duv     2Nx1 matrix of 2D pixel velocities
    depths  1xN matrix of depth values
    K       3x3 camera intrinsic matrix
Returns
cv::Vec6d 6x1 camera twist
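
A minimal usage sketch for computeTwist, assuming the matrix shapes documented above; the CV_32F element type and the numeric values are illustrative assumptions only:

#include <iostream>
#include <opencv2/core.hpp>
#include <opencv2/tracking/twist.hpp>

int main()
{
    // Two tracked pixels (uv is 2xN), their velocities (duv is 2Nx1),
    // depths (1xN) and the camera intrinsics (3x3).
    cv::Mat uv     = (cv::Mat_<float>(2, 2) << 320.f, 400.f,
                                               240.f, 260.f);
    cv::Mat duv    = (cv::Mat_<float>(4, 1) << 2.f, -1.f, 3.f, 0.5f);
    cv::Mat depths = (cv::Mat_<float>(1, 2) << 1.2f, 1.5f);
    cv::Mat K      = (cv::Mat_<float>(3, 3) << 600.f,   0.f, 320.f,
                                                 0.f, 600.f, 240.f,
                                                 0.f,   0.f,   1.f);

    // twist = (vx, vy, vz, wx, wy, wz): linear and angular camera velocity
    cv::Vec6d twist = cv::detail::tracking::computeTwist( uv, duv, depths, K );
    std::cout << twist << std::endl;
    return 0;
}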