This code is the applied part of the PhD thesis of [Emilie Devijver](http://www.math.u-psud.fr/~devijver/).
The function selmix computes a collection of multivariate Gaussian mixture regression models.
From the estimated parameters, we can compute classical model selection criteria, such as BIC or AIC, or apply the slope heuristic using the CAPUSHE package.
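As a reminder of what these criteria compute, here is a minimal sketch (not part of selmix; the names `loglik`, `n_params`, and `n_obs` are hypothetical) of how BIC and AIC are classically obtained from a fitted model's log-likelihood, and how the model minimizing BIC would be picked from a collection:

```python
import math

def bic(loglik, n_params, n_obs):
    """Bayesian Information Criterion: -2*logL + k*log(n)."""
    return -2.0 * loglik + n_params * math.log(n_obs)

def aic(loglik, n_params):
    """Akaike Information Criterion: -2*logL + 2*k."""
    return -2.0 * loglik + 2.0 * n_params

# Toy collection of fitted models as (log-likelihood, parameter count);
# select the one minimizing BIC.
models = [(-120.0, 5), (-115.0, 9), (-114.5, 14)]
best = min(models, key=lambda m: bic(m[0], m[1], n_obs=100))
```

The slope heuristic implemented in CAPUSHE calibrates the penalty constant from the data instead of fixing it as BIC and AIC do.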
The methodology used is described in 'Model-Based Clustering for High-Dimensional Data. Application to Functional Data.', available at [this location](https://hal.archives-ouvertes.fr/hal-01060063).
The regressors, denoted by X (of size n x p), and the responses, denoted by Y (of size n x q), are required arguments.
Optionally, one can also specify:
* gamma: the weight power in the Lasso penalty (following Städler et al., $\gamma \in \{0, 1/2, 1\}$);
* mini: the minimum number of EM iterations;
* maxi: the maximum number of EM iterations;
* tau: the threshold for stopping the EM algorithm;
* kmin and kmax: the bounds on the number of mixture components;
* rangmin and rangmax: the bounds on the candidate rank values.
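To make the role of gamma concrete, the weighted Lasso penalty of Städler et al. has the form $\lambda \sum_k \pi_k^{\gamma}\,\|\phi_k\|_1$, where the mixture proportions reweight each component's l1 norm. A minimal illustrative sketch (hypothetical names, not the selmix code):

```python
def lasso_penalty(lam, pi, phi, gamma):
    """Weighted Lasso penalty: lam * sum_k pi_k^gamma * ||phi_k||_1.

    pi: list of mixture proportions, one per component;
    phi: list of coefficient matrices (nested lists), one per component;
    gamma: weight power; 0 ignores the proportions, 1 weights fully by them.
    """
    total = 0.0
    for pi_k, phi_k in zip(pi, phi):
        l1 = sum(abs(c) for row in phi_k for c in row)
        total += (pi_k ** gamma) * l1
    return lam * total

# Toy example with two components.
pi = [0.25, 0.75]
phi = [[[1.0, -2.0]], [[0.5, 0.0]]]
```

With gamma = 0 every component's coefficients are penalized equally; with gamma = 1 small components are penalized less, which is what makes the choice of gamma matter in practice.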
For index=1, it computes the Lasso-MLE procedure.
For index=2, it computes the Lasso-Rank procedure.
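Both procedures share a first step that selects relevant variables with the Lasso (over the grid of regularization parameters) and then differ in how they re-estimate the coefficients on the selected support. A hypothetical skeleton of this dispatch, with illustrative function names that are not the real API:

```python
def run_procedure(index, select_with_lasso, refit_mle, refit_low_rank):
    """index=1 -> Lasso-MLE, index=2 -> Lasso-Rank."""
    # Step 1 (common): select the relevant variables with the Lasso.
    support = select_with_lasso()
    # Step 2: re-estimate on the selected support.
    if index == 1:
        return refit_mle(support)       # unpenalized MLE on the support
    elif index == 2:
        return refit_low_rank(support)  # rank-constrained refit on the support
    raise ValueError("index must be 1 or 2")

# Stub usage showing the dispatch only; the callables are placeholders.
result = run_procedure(
    2,
    select_with_lasso=lambda: ["x1", "x3"],
    refit_mle=lambda s: ("mle", s),
    refit_low_rank=lambda s: ("low-rank", s),
)
```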
/!\ Be careful with the current working path /!\
The output contains:

* phiInit, rhoInit, piInit, gamInit: the initializations of the parameters phi, rho, pi and gamma;
* gridLambda: the grid of regularization parameters used to select relevant variables (if kmax-kmin=0, this is the grid itself; otherwise, it is the last grid of regularization parameters);
* A1, A2: the indices of the selected and non-selected variables (arrays of size (p+1) x q x size(gridLambda));
* Phi, Rho, Pi: the parameter estimates obtained by the Lasso-MLE procedure if index=1, or by the Lasso-Rank procedure if index=2.