# Gaussian Mixture Models (GMM)

gmm.h is an implementation of *Gaussian Mixture Models* (GMMs). The main functionality provided by this module is learning GMMs from data by maximum likelihood. Model optimization uses the Expectation Maximization (EM) algorithm [6]. The implementation supports `float` or `double` data types, is parallelized, and is tuned to work reliably and effectively on datasets of visual features. Stability is obtained in part by regularizing and restricting the parameters of the GMM.

Getting started demonstrates how to use the C API to learn a GMM from training data.

# Getting started

In order to use gmm.h to learn a GMM from training data, create a new VlGMM object instance, set the parameters as desired, and run the training code. The following example learns `numClusters` Gaussian components from `numData` vectors of dimension `dimension` and storage class `float`, using at most 100 EM iterations:
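A sketch of this workflow follows, based on the standard VlGMM calls (`vl_gmm_new`, `vl_gmm_set_max_num_iterations`, `vl_gmm_set_initialization`, `vl_gmm_cluster`); the variables `dimension`, `numClusters`, `numData`, and the `data` buffer of `numData * dimension` packed `float` values are assumed to be defined by the caller:

```c
#include <vl/gmm.h>

float * means ;
float * covariances ;
float * priors ;

// create a new instance of a GMM object for float data
VlGMM * gmm = vl_gmm_new (VL_TYPE_FLOAT, dimension, numClusters) ;

// set the maximum number of EM iterations to 100
vl_gmm_set_max_num_iterations (gmm, 100) ;

// set the initialization to random selection of the mixture parameters
vl_gmm_set_initialization (gmm, VlGMMRand) ;

// cluster the data, i.e. learn the GMM
vl_gmm_cluster (gmm, data, numData) ;

// retrieve the parameters of the estimated mixture
means = (float*) vl_gmm_get_means (gmm) ;
covariances = (float*) vl_gmm_get_covariances (gmm) ;
priors = (float*) vl_gmm_get_priors (gmm) ;
```

The `means` and `covariances` buffers each hold `numClusters * dimension` values (one diagonal per mode), while `priors` holds `numClusters` mixing weights.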

- Note
- VlGMM assumes that the covariance matrices of the GMM are diagonal. This significantly reduces the number of parameters to learn and is usually an acceptable compromise in vision applications. If the data is significantly correlated, it can be beneficial to de-correlate it by a PCA rotation or projection as a pre-processing step.

vl_gmm_get_loglikelihood is used to get the final log-likelihood of the estimated mixture, vl_gmm_get_means and vl_gmm_get_covariances to obtain the means and the diagonals of the covariance matrices of the estimated Gaussian modes, and vl_gmm_get_posteriors to get the posterior probabilities that a given point is associated with each of the modes (soft assignments).
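A sketch of reading these results back, assuming a VlGMM object `gmm` of `float` storage class that has already been trained with `vl_gmm_cluster`:

```c
// final log-likelihood of the model on the training data
double loglikelihood = vl_gmm_get_loglikelihood (gmm) ;

// means and diagonal covariances: numClusters * dimension values each
float const * means = (float const *) vl_gmm_get_means (gmm) ;
float const * covariances = (float const *) vl_gmm_get_covariances (gmm) ;

// soft assignments: numData * numClusters posterior probabilities
float const * posteriors = (float const *) vl_gmm_get_posteriors (gmm) ;
```

The returned pointers refer to buffers owned by the VlGMM object, so they remain valid only as long as the object itself.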

The learning algorithm, which uses EM, finds a local optimum of the objective function. Therefore the initialization is crucial for obtaining a good model, measured in terms of the final log-likelihood. VlGMM supports a few methods (use vl_gmm_set_initialization to choose one) as follows:

Method | VlGMMInitialization enumeration | Description |
---|---|---|
Random initialization | VlGMMRand | Random initialization of the mixture parameters |
KMeans | VlGMMKMeans | Initialization of the mixture parameters using VlKMeans |
Custom | VlGMMCustom | User-specified initialization |

Note that in the case of VlGMMKMeans initialization, an object of type VlKMeans must be created and passed to the VlGMM instance (see K-means clustering to see how to correctly set up this object).
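A sketch of the KMeans-based setup, assuming the standard VlKMeans API (`vl_kmeans_new`, `vl_gmm_set_kmeans_init_object`) and a trained-to-be `gmm` object together with `data` and `numData` as in the earlier example; the particular KMeans algorithm and seeding chosen here are illustrative:

```c
#include <vl/gmm.h>
#include <vl/kmeans.h>

// create and configure a KMeans object for float data and L2 distance
VlKMeans * kmeans = vl_kmeans_new (VL_TYPE_FLOAT, VlDistanceL2) ;
vl_kmeans_set_algorithm (kmeans, VlKMeansANN) ;
vl_kmeans_set_initialization (kmeans, VlKMeansPlusPlus) ;
vl_kmeans_set_max_num_iterations (kmeans, 30) ;

// tell the GMM to derive its initial parameters from this KMeans object
vl_gmm_set_initialization (gmm, VlGMMKMeans) ;
vl_gmm_set_kmeans_init_object (gmm, kmeans) ;

// learn the GMM; KMeans clustering is run internally first
vl_gmm_cluster (gmm, data, numData) ;
```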

To use the VlGMMCustom method, the initial means, covariances, and priors have to be specified using the vl_gmm_set_means, vl_gmm_set_covariances, and vl_gmm_set_priors methods.
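A sketch of the custom initialization path; `initMeans`, `initCovariances`, and `initPriors` are hypothetical user-supplied buffers of `numClusters * dimension`, `numClusters * dimension`, and `numClusters` values respectively, matching the storage class of the GMM:

```c
// select user-specified initialization and hand over the starting point
vl_gmm_set_initialization (gmm, VlGMMCustom) ;
vl_gmm_set_means (gmm, initMeans) ;
vl_gmm_set_covariances (gmm, initCovariances) ;
vl_gmm_set_priors (gmm, initPriors) ;

// EM then refines these parameters starting from the supplied values
vl_gmm_cluster (gmm, data, numData) ;
```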